[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg. 18445 1726882527.30105: starting run ansible-playbook [core 2.17.4] config file = None configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules'] ansible python module location = /usr/local/lib/python3.12/site-packages/ansible ansible collection location = /tmp/collections-Xyq executable location = /usr/local/bin/ansible-playbook python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] (/usr/bin/python3.12) jinja version = 3.1.4 libyaml = True No config file found; using defaults 18445 1726882527.30598: Added group all to inventory 18445 1726882527.30601: Added group ungrouped to inventory 18445 1726882527.30605: Group all now contains ungrouped 18445 1726882527.30609: Examining possible inventory source: /tmp/network-91m/inventory.yml 18445 1726882527.58026: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache 18445 1726882527.58203: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py 18445 1726882527.58227: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory 18445 1726882527.58341: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py 18445 1726882527.58537: Loaded config def from plugin (inventory/script) 18445 1726882527.58540: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py 18445 1726882527.58586: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py 18445 1726882527.58793: Loaded config def from plugin (inventory/yaml) 18445 1726882527.58795: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 18445 1726882527.58999: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 18445 1726882527.59929: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 18445 1726882527.59932: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 18445 1726882527.59935: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 18445 1726882527.59941: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 18445 1726882527.60065: Loading data from /tmp/network-91m/inventory.yml 18445 1726882527.60133: /tmp/network-91m/inventory.yml was not parsable by auto 18445 1726882527.60320: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 18445 1726882527.60361: Loading data from /tmp/network-91m/inventory.yml 18445 1726882527.60562: group all already in inventory 18445 1726882527.60571: set inventory_file for managed_node1 18445 1726882527.60575: set inventory_dir for managed_node1 18445 1726882527.60576: Added host managed_node1 to inventory 18445 1726882527.60578: Added host managed_node1 to group all 18445 1726882527.60579: set ansible_host for managed_node1 18445 1726882527.60580: 
set ansible_ssh_extra_args for managed_node1 18445 1726882527.60583: set inventory_file for managed_node2 18445 1726882527.60586: set inventory_dir for managed_node2 18445 1726882527.60587: Added host managed_node2 to inventory 18445 1726882527.60588: Added host managed_node2 to group all 18445 1726882527.60589: set ansible_host for managed_node2 18445 1726882527.60590: set ansible_ssh_extra_args for managed_node2 18445 1726882527.60592: set inventory_file for managed_node3 18445 1726882527.60595: set inventory_dir for managed_node3 18445 1726882527.60595: Added host managed_node3 to inventory 18445 1726882527.60597: Added host managed_node3 to group all 18445 1726882527.60597: set ansible_host for managed_node3 18445 1726882527.60598: set ansible_ssh_extra_args for managed_node3 18445 1726882527.60714: Reconcile groups and hosts in inventory. 18445 1726882527.60719: Group ungrouped now contains managed_node1 18445 1726882527.60721: Group ungrouped now contains managed_node2 18445 1726882527.60723: Group ungrouped now contains managed_node3 18445 1726882527.60806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 18445 1726882527.61162: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 18445 1726882527.61212: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 18445 1726882527.61241: Loaded config def from plugin (vars/host_group_vars) 18445 1726882527.61243: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 18445 1726882527.61249: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 18445 1726882527.61374: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 18445 1726882527.61417: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 18445 1726882527.62101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882527.62311: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 18445 1726882527.62470: Loaded config def from plugin (connection/local) 18445 1726882527.62473: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 18445 1726882527.63660: Loaded config def from plugin (connection/paramiko_ssh) 18445 1726882527.63665: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 18445 1726882527.65780: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 18445 1726882527.65819: Loaded config def from plugin (connection/psrp) 18445 1726882527.65822: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 18445 1726882527.67498: Loading ModuleDocFragment 'connection_pipelining' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 18445 1726882527.67538: Loaded config def from plugin (connection/ssh) 18445 1726882527.67541: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 18445 1726882527.71716: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 18445 1726882527.71876: Loaded config def from plugin (connection/winrm) 18445 1726882527.71879: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 18445 1726882527.71911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 18445 1726882527.72092: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 18445 1726882527.72166: Loaded config def from plugin (shell/cmd) 18445 1726882527.72168: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 18445 1726882527.72308: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 18445 1726882527.72410: Loaded config def from plugin (shell/powershell) 18445 1726882527.72412: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 18445 1726882527.72469: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 18445 1726882527.72887: Loaded config def from plugin (shell/sh) 18445 1726882527.72889: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 18445 1726882527.72923: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 18445 1726882527.73609: Loaded config def from plugin (become/runas) 18445 1726882527.73611: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 18445 1726882527.73998: Loaded config def from plugin (become/su) 18445 1726882527.74001: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 18445 1726882527.74400: Loaded config def from plugin (become/sudo) 18445 1726882527.74402: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 18445 1726882527.74435: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_initscripts.yml 18445 1726882527.75184: in VariableManager get_vars() 18445 1726882527.75206: done with get_vars() 18445 1726882527.75442: trying /usr/local/lib/python3.12/site-packages/ansible/modules 18445 1726882527.82022: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 18445 1726882527.82260: in VariableManager 
get_vars() 18445 1726882527.82268: done with get_vars() 18445 1726882527.82271: variable 'playbook_dir' from source: magic vars 18445 1726882527.82272: variable 'ansible_playbook_python' from source: magic vars 18445 1726882527.82272: variable 'ansible_config_file' from source: magic vars 18445 1726882527.82273: variable 'groups' from source: magic vars 18445 1726882527.82274: variable 'omit' from source: magic vars 18445 1726882527.82275: variable 'ansible_version' from source: magic vars 18445 1726882527.82276: variable 'ansible_check_mode' from source: magic vars 18445 1726882527.82277: variable 'ansible_diff_mode' from source: magic vars 18445 1726882527.82277: variable 'ansible_forks' from source: magic vars 18445 1726882527.82278: variable 'ansible_inventory_sources' from source: magic vars 18445 1726882527.82279: variable 'ansible_skip_tags' from source: magic vars 18445 1726882527.82280: variable 'ansible_limit' from source: magic vars 18445 1726882527.82280: variable 'ansible_run_tags' from source: magic vars 18445 1726882527.82281: variable 'ansible_verbosity' from source: magic vars 18445 1726882527.82377: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml 18445 1726882527.83809: in VariableManager get_vars() 18445 1726882527.83824: done with get_vars() 18445 1726882527.83981: in VariableManager get_vars() 18445 1726882527.84002: done with get_vars() 18445 1726882527.84037: in VariableManager get_vars() 18445 1726882527.84048: done with get_vars() 18445 1726882527.84194: in VariableManager get_vars() 18445 1726882527.84206: done with get_vars() 18445 1726882527.84395: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 18445 1726882527.84857: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 18445 1726882527.84985: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 18445 1726882527.86655: in VariableManager get_vars() 18445 1726882527.86679: done with get_vars() 18445 1726882527.87542: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 18445 1726882527.87796: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 18445 1726882527.90405: in VariableManager get_vars() 18445 1726882527.90423: done with get_vars() 18445 1726882527.90659: in VariableManager get_vars() 18445 1726882527.90780: done with get_vars() 18445 1726882527.90783: variable 'playbook_dir' from source: magic vars 18445 1726882527.90784: variable 'ansible_playbook_python' from source: magic vars 18445 1726882527.90784: variable 'ansible_config_file' from source: magic vars 18445 1726882527.90785: variable 'groups' from source: magic vars 18445 1726882527.90786: variable 'omit' from source: magic vars 18445 1726882527.90787: variable 'ansible_version' from source: magic vars 18445 1726882527.90787: variable 'ansible_check_mode' from source: magic vars 18445 1726882527.90788: variable 'ansible_diff_mode' from source: magic vars 18445 1726882527.90789: variable 'ansible_forks' from source: magic vars 18445 1726882527.90789: variable 'ansible_inventory_sources' from source: magic vars 18445 1726882527.90790: variable 'ansible_skip_tags' from source: magic vars 18445 1726882527.90791: variable 'ansible_limit' from source: magic vars 18445 
1726882527.90792: variable 'ansible_run_tags' from source: magic vars 18445 1726882527.90792: variable 'ansible_verbosity' from source: magic vars 18445 1726882527.90824: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml 18445 1726882527.91005: in VariableManager get_vars() 18445 1726882527.91009: done with get_vars() 18445 1726882527.91010: variable 'playbook_dir' from source: magic vars 18445 1726882527.91011: variable 'ansible_playbook_python' from source: magic vars 18445 1726882527.91012: variable 'ansible_config_file' from source: magic vars 18445 1726882527.91013: variable 'groups' from source: magic vars 18445 1726882527.91014: variable 'omit' from source: magic vars 18445 1726882527.91014: variable 'ansible_version' from source: magic vars 18445 1726882527.91015: variable 'ansible_check_mode' from source: magic vars 18445 1726882527.91016: variable 'ansible_diff_mode' from source: magic vars 18445 1726882527.91016: variable 'ansible_forks' from source: magic vars 18445 1726882527.91017: variable 'ansible_inventory_sources' from source: magic vars 18445 1726882527.91018: variable 'ansible_skip_tags' from source: magic vars 18445 1726882527.91019: variable 'ansible_limit' from source: magic vars 18445 1726882527.91019: variable 'ansible_run_tags' from source: magic vars 18445 1726882527.91020: variable 'ansible_verbosity' from source: magic vars 18445 1726882527.91048: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml 18445 1726882527.91242: in VariableManager get_vars() 18445 1726882527.91253: done with get_vars() 18445 1726882527.91295: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 18445 1726882527.91522: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 18445 1726882527.91721: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 18445 1726882527.92570: in VariableManager get_vars() 18445 1726882527.92590: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 18445 1726882527.95873: in VariableManager get_vars() 18445 1726882527.96029: done with get_vars() 18445 1726882527.96068: in VariableManager get_vars() 18445 1726882527.96071: done with get_vars() 18445 1726882527.96073: variable 'playbook_dir' from source: magic vars 18445 1726882527.96074: variable 'ansible_playbook_python' from source: magic vars 18445 1726882527.96075: variable 'ansible_config_file' from source: magic vars 18445 1726882527.96075: variable 'groups' from source: magic vars 18445 1726882527.96076: variable 'omit' from source: magic vars 18445 1726882527.96077: variable 'ansible_version' from source: magic vars 18445 1726882527.96078: variable 'ansible_check_mode' from source: magic vars 18445 1726882527.96078: variable 'ansible_diff_mode' from source: magic vars 18445 1726882527.96079: variable 'ansible_forks' from source: magic vars 18445 1726882527.96080: variable 'ansible_inventory_sources' from source: magic vars 18445 1726882527.96081: variable 'ansible_skip_tags' from source: magic vars 18445 1726882527.96081: variable 'ansible_limit' from source: magic vars 18445 1726882527.96082: variable 'ansible_run_tags' from source: magic vars 18445 1726882527.96083: variable 'ansible_verbosity' from source: magic vars 
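
The "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" messages in the trace above come from collection routing: before a plugin is loaded, its fully qualified name is checked against a redirect table and swapped for the current name. The snippet below is only a rough, hypothetical illustration of that lookup; ansible-core's real routing lives in its runtime metadata, not in a hand-written table like this.

    # Hypothetical sketch of name redirection, modeled on the
    # "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf"
    # lines above. The table and resolve() helper are invented for illustration;
    # they are not ansible-core's data structures.
    ACTION_REDIRECTS = {
        "ansible.builtin.yum": "ansible.builtin.dnf",
    }

    def resolve(plugin_type, name):
        """Follow redirects until the name no longer has an entry."""
        seen = set()
        while name in ACTION_REDIRECTS and name not in seen:
            seen.add(name)
            target = ACTION_REDIRECTS[name]
            print(f"redirecting (type: {plugin_type}) {name} to {target}")
            name = target
        return name

    print(resolve("action", "ansible.builtin.yum"))  # -> ansible.builtin.dnf
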
18445 1726882527.96114: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml 18445 1726882527.96299: in VariableManager get_vars() 18445 1726882527.96310: done with get_vars() 18445 1726882527.96465: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 18445 1726882527.96698: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 18445 1726882527.96889: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 18445 1726882527.97800: in VariableManager get_vars() 18445 1726882527.97818: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 18445 1726882528.00861: in VariableManager get_vars() 18445 1726882528.00877: done with get_vars() 18445 1726882528.01026: in VariableManager get_vars() 18445 1726882528.01038: done with get_vars() 18445 1726882528.01156: in VariableManager get_vars() 18445 1726882528.01171: done with get_vars() 18445 1726882528.01355: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 18445 1726882528.01371: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 18445 1726882528.04277: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 18445 1726882528.04457: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 18445 1726882528.04460: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-Xyq/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) 18445 1726882528.04497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 18445 1726882528.04523: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 18445 1726882528.04703: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 18445 1726882528.04767: Loaded config def from plugin (callback/default) 18445 1726882528.04770: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 18445 1726882528.05954: Loaded config def from plugin (callback/junit) 18445 1726882528.05956: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 18445 1726882528.06001: Loading ModuleDocFragment 'result_format_callback' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 18445 1726882528.06072: Loaded config def from plugin (callback/minimal) 18445 1726882528.06074: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 18445 1726882528.06113: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 18445 1726882528.06215: Loaded config def from plugin (callback/tree) 18445 1726882528.06218: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 18445 1726882528.06419: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 18445 1726882528.06421: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-Xyq/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. 
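
The repeated "trying <directory>" / "Loading ... (found_in_cache=...)" lines above all follow one pattern: each plugin type is looked up across a list of search paths, __init__ files are skipped as reserved names, and the resolved path is cached so later lookups report found_in_cache=True. Below is a minimal sketch of that pattern with example paths and plugin names; it is not ansible-core's actual PluginLoader.

    import os

    _cache = {}

    def find_plugin(name, search_paths):
        """Search candidate directories for <name>.py, caching the first hit."""
        if name in _cache:
            print(f"Loading '{name}' from {_cache[name]} (found_in_cache=True)")
            return _cache[name]
        if name == "__init__":
            print(f"'{name}' skipped due to reserved name")
            return None
        for directory in search_paths:
            print(f"trying {directory}")
            candidate = os.path.join(directory, name + ".py")
            if os.path.isfile(candidate):
                _cache[name] = candidate
                print(f"Loading '{name}' from {candidate} (found_in_cache=False)")
                return candidate
        return None

    paths = ["/usr/local/lib/python3.12/site-packages/ansible/plugins/callback"]
    find_plugin("default", paths)  # searches the directories
    find_plugin("default", paths)  # answered from the cache if the first call found it

The "Skipping callback ... as we already have a stdout callback" lines reflect a related rule: only one callback plugin is allowed to own stdout, so later stdout-capable candidates are skipped.
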
PLAYBOOK: tests_ethernet_initscripts.yml *************************************** 10 plays in /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_initscripts.yml 18445 1726882528.06484: in VariableManager get_vars() 18445 1726882528.06496: done with get_vars() 18445 1726882528.06502: in VariableManager get_vars() 18445 1726882528.06510: done with get_vars() 18445 1726882528.06514: variable 'omit' from source: magic vars 18445 1726882528.07785: in VariableManager get_vars() 18445 1726882528.07802: done with get_vars() 18445 1726882528.07823: variable 'omit' from source: magic vars PLAY [Run playbook 'playbooks/tests_ethernet.yml' with initscripts as provider] *** 18445 1726882528.09100: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy 18445 1726882528.09199: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py 18445 1726882528.09230: getting the remaining hosts for this loop 18445 1726882528.09232: done getting the remaining hosts for this loop 18445 1726882528.09235: getting the next task for host managed_node1 18445 1726882528.09238: done getting next task for host managed_node1 18445 1726882528.09240: ^ task is: TASK: Gathering Facts 18445 1726882528.09242: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882528.09249: getting variables 18445 1726882528.09250: in VariableManager get_vars() 18445 1726882528.09260: Calling all_inventory to load vars for managed_node1 18445 1726882528.09263: Calling groups_inventory to load vars for managed_node1 18445 1726882528.09267: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882528.09298: Calling all_plugins_play to load vars for managed_node1 18445 1726882528.09313: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882528.09316: Calling groups_plugins_play to load vars for managed_node1 18445 1726882528.09349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882528.09749: done with get_vars() 18445 1726882528.09756: done getting variables 18445 1726882528.09828: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_initscripts.yml:5 Friday 20 September 2024 21:35:28 -0400 (0:00:00.035) 0:00:00.035 ****** 18445 1726882528.09850: entering _queue_task() for managed_node1/gather_facts 18445 1726882528.09852: Creating lock for gather_facts 18445 1726882528.10192: worker is 1 (out of 1 available) 18445 1726882528.10218: exiting _queue_task() for managed_node1/gather_facts 18445 1726882528.10231: done queuing things up, now waiting for results queue to drain 18445 1726882528.10233: waiting for pending results... 
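
At this point the linear strategy has picked "Gathering Facts" as the next task for managed_node1 and handed it to a forked worker ("worker is 1 (out of 1 available)"), after which the main process simply waits for the results queue to drain. The following is a bare-bones sketch of that producer/worker/results-queue shape, using the host and task names from the log as example strings; it is not ansible-core's WorkerProcess.

    import multiprocessing as mp

    def worker(job, results):
        host, task = job
        # A real worker would run the TaskExecutor here; this just fakes a result.
        results.put({"host": host, "task": task, "rc": 0})

    if __name__ == "__main__":
        results = mp.Queue()
        job = ("managed_node1", "Gathering Facts")
        proc = mp.Process(target=worker, args=(job, results))  # queue the task to a worker
        proc.start()
        print("waiting for pending results...")
        print(results.get())  # drain the results queue
        proc.join()
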
18445 1726882528.11572: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18445 1726882528.12195: in run() - task 0e448fcc-3ce9-f6eb-935c-00000000007c 18445 1726882528.12261: variable 'ansible_search_path' from source: unknown 18445 1726882528.12327: calling self._execute() 18445 1726882528.12467: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882528.12531: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882528.12578: variable 'omit' from source: magic vars 18445 1726882528.12697: variable 'omit' from source: magic vars 18445 1726882528.12841: variable 'omit' from source: magic vars 18445 1726882528.12892: variable 'omit' from source: magic vars 18445 1726882528.12943: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18445 1726882528.12986: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18445 1726882528.13010: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18445 1726882528.13035: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18445 1726882528.13052: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18445 1726882528.13097: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18445 1726882528.13109: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882528.13117: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882528.13221: Set connection var ansible_shell_type to sh 18445 1726882528.13234: Set connection var ansible_module_compression to ZIP_DEFLATED 18445 1726882528.13250: Set connection var ansible_connection to ssh 18445 1726882528.13269: Set connection var ansible_pipelining to False 18445 1726882528.13283: Set connection var ansible_shell_executable to /bin/sh 18445 1726882528.13295: Set connection var ansible_timeout to 10 18445 1726882528.13396: variable 'ansible_shell_executable' from source: unknown 18445 1726882528.13406: variable 'ansible_connection' from source: unknown 18445 1726882528.13413: variable 'ansible_module_compression' from source: unknown 18445 1726882528.13420: variable 'ansible_shell_type' from source: unknown 18445 1726882528.13427: variable 'ansible_shell_executable' from source: unknown 18445 1726882528.13435: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882528.13443: variable 'ansible_pipelining' from source: unknown 18445 1726882528.13449: variable 'ansible_timeout' from source: unknown 18445 1726882528.13459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882528.13652: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 18445 1726882528.13673: variable 'omit' from source: magic vars 18445 1726882528.13684: starting attempt loop 18445 1726882528.13694: running the handler 18445 1726882528.13715: variable 'ansible_facts' from source: unknown 18445 1726882528.13736: _low_level_execute_command(): starting 18445 1726882528.13749: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18445 1726882528.14688: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18445 1726882528.14739: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18445 1726882528.14757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18445 1726882528.14779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18445 1726882528.14840: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 18445 1726882528.14852: stderr chunk (state=3): >>>debug2: match not found <<< 18445 1726882528.14872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18445 1726882528.14890: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18445 1726882528.14902: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 18445 1726882528.14945: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18445 1726882528.14962: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18445 1726882528.14980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18445 1726882528.14995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18445 1726882528.15009: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 18445 1726882528.15060: stderr chunk (state=3): >>>debug2: match found <<< 18445 1726882528.15066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18445 1726882528.15143: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 18445 1726882528.15168: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18445 1726882528.15184: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18445 1726882528.15350: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18445 1726882528.17021: stdout chunk (state=3): >>>/root <<< 18445 1726882528.17204: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18445 1726882528.17209: stdout chunk (state=3): >>><<< 18445 1726882528.17211: stderr chunk (state=3): >>><<< 18445 1726882528.17324: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18445 1726882528.17328: _low_level_execute_command(): starting 18445 1726882528.17331: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882528.1723437-18473-61144104753262 `" && echo ansible-tmp-1726882528.1723437-18473-61144104753262="` echo /root/.ansible/tmp/ansible-tmp-1726882528.1723437-18473-61144104753262 `" ) && sleep 0' 18445 1726882528.18414: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18445 1726882528.18440: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18445 1726882528.18466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18445 1726882528.18486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18445 1726882528.18536: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 18445 1726882528.18554: stderr chunk (state=3): >>>debug2: match not found <<< 18445 1726882528.18575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18445 1726882528.18594: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18445 1726882528.18615: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 18445 1726882528.18632: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18445 1726882528.18651: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18445 1726882528.18673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18445 1726882528.18694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18445 1726882528.18714: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 18445 1726882528.18729: stderr chunk (state=3): >>>debug2: match found <<< 18445 1726882528.18748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18445 1726882528.18838: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 18445 1726882528.18865: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18445 1726882528.18889: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18445 1726882528.19014: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18445 1726882528.20894: stdout chunk (state=3): >>>ansible-tmp-1726882528.1723437-18473-61144104753262=/root/.ansible/tmp/ansible-tmp-1726882528.1723437-18473-61144104753262 <<< 18445 1726882528.21054: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18445 1726882528.21084: stderr chunk (state=3): >>><<< 18445 1726882528.21097: stdout chunk (state=3): >>><<< 18445 1726882528.21149: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882528.1723437-18473-61144104753262=/root/.ansible/tmp/ansible-tmp-1726882528.1723437-18473-61144104753262 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18445 1726882528.21230: variable 'ansible_module_compression' from source: unknown 18445 1726882528.21275: ANSIBALLZ: Using generic lock for ansible.legacy.setup 18445 1726882528.21284: ANSIBALLZ: Acquiring lock 18445 1726882528.21306: ANSIBALLZ: Lock acquired: 140250087337296 18445 1726882528.21310: ANSIBALLZ: Creating module 18445 1726882528.69898: ANSIBALLZ: Writing module into payload 18445 1726882528.70080: ANSIBALLZ: Writing module 18445 1726882528.70250: ANSIBALLZ: Renaming module 18445 1726882528.70944: ANSIBALLZ: Done creating module 18445 1726882528.70988: variable 'ansible_facts' from source: unknown 18445 1726882528.71154: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18445 1726882528.71169: _low_level_execute_command(): starting 18445 1726882528.71183: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 18445 1726882528.73017: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18445 1726882528.73869: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18445 1726882528.73882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18445 1726882528.73918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18445 1726882528.73969: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 18445 1726882528.73973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18445 1726882528.73976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18445 1726882528.74090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 18445 1726882528.74093: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 18445 1726882528.74283: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 18445 1726882528.74287: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18445 1726882528.74358: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18445 1726882528.74517: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18445 1726882528.76185: stdout chunk (state=3): >>>PLATFORM <<< 18445 1726882528.76273: stdout chunk (state=3): >>>Linux <<< 18445 1726882528.76305: stdout chunk (state=3): >>>FOUND /usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 18445 1726882528.76509: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18445 1726882528.76513: stdout chunk (state=3): >>><<< 18445 1726882528.76532: stderr chunk (state=3): >>><<< 18445 1726882528.76569: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18445 1726882528.76578 [managed_node1]: found interpreters: ['/usr/bin/python3.9', '/usr/bin/python3', '/usr/bin/python3'] 18445 1726882528.76695: _low_level_execute_command(): starting 18445 1726882528.76698: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 && sleep 0' 18445 1726882528.77273: Sending initial data 18445 1726882528.77276: Sent initial data (1181 bytes) 18445 1726882528.79644: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18445 1726882528.79671: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18445 1726882528.79686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18445 1726882528.79702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18445 1726882528.79771: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 18445 1726882528.79786: stderr chunk (state=3): >>>debug2: match not found <<< 18445 1726882528.79799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18445 1726882528.79815: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18445 1726882528.79825: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.44.90 is address <<< 18445 1726882528.79835: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18445 1726882528.79886: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18445 1726882528.79900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18445 1726882528.79915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18445 1726882528.79927: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 18445 1726882528.79944: stderr chunk (state=3): >>>debug2: match found <<< 18445 1726882528.79960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18445 1726882528.80072: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 18445 1726882528.80092: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18445 1726882528.80114: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18445 1726882528.80236: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18445 1726882528.83991: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 18445 1726882528.84576: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18445 1726882528.84579: stdout chunk (state=3): >>><<< 18445 1726882528.84581: stderr chunk (state=3): >>><<< 18445 1726882528.84584: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18445 1726882528.84587: variable 'ansible_facts' from source: unknown 18445 1726882528.84589: variable 'ansible_facts' from source: unknown 18445 1726882528.84591: variable 'ansible_module_compression' from source: unknown 18445 1726882528.84669: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18445x1hycoyh/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18445 1726882528.84672: variable 'ansible_facts' from source: unknown 18445 1726882528.84843: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882528.1723437-18473-61144104753262/AnsiballZ_setup.py 18445 1726882528.85698: Sending initial data 18445 1726882528.85701: Sent initial data (153 bytes) 18445 1726882528.87774: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18445 1726882528.87777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18445 1726882528.87804: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 18445 1726882528.87807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18445 1726882528.87809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 18445 1726882528.87812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18445 1726882528.87870: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 18445 1726882528.88535: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18445 1726882528.88539: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18445 1726882528.88634: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18445 1726882528.90405: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 18445 1726882528.90490: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 18445 1726882528.90590: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18445x1hycoyh/tmp241hfn8u /root/.ansible/tmp/ansible-tmp-1726882528.1723437-18473-61144104753262/AnsiballZ_setup.py <<< 18445 1726882528.90685: stderr chunk (state=3): >>>debug1: 
Couldn't stat remote file: No such file or directory <<< 18445 1726882528.93800: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18445 1726882528.94066: stderr chunk (state=3): >>><<< 18445 1726882528.94070: stdout chunk (state=3): >>><<< 18445 1726882528.94072: done transferring module to remote 18445 1726882528.94074: _low_level_execute_command(): starting 18445 1726882528.94076: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882528.1723437-18473-61144104753262/ /root/.ansible/tmp/ansible-tmp-1726882528.1723437-18473-61144104753262/AnsiballZ_setup.py && sleep 0' 18445 1726882528.94630: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18445 1726882528.94633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18445 1726882528.94672: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 18445 1726882528.94675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18445 1726882528.94677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18445 1726882528.94745: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 18445 1726882528.94749: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18445 1726882528.94856: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18445 1726882528.96616: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18445 1726882528.96680: stderr chunk (state=3): >>><<< 18445 1726882528.96683: stdout chunk (state=3): >>><<< 18445 1726882528.96783: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 18445 1726882528.96787: _low_level_execute_command(): starting 18445 1726882528.96789: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882528.1723437-18473-61144104753262/AnsiballZ_setup.py && sleep 0' 18445 1726882528.97981: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18445 1726882528.97987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18445 1726882528.98022: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 18445 1726882528.98025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18445 1726882528.98027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 18445 1726882528.98029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18445 1726882528.98098: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 18445 1726882528.98107: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18445 1726882528.98109: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18445 1726882528.98213: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18445 1726882529.00761: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 18445 1726882529.00767: stdout chunk (state=3): >>>import '_thread' # <<< 18445 1726882529.00770: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 18445 1726882529.00858: stdout chunk (state=3): >>>import '_io' # <<< 18445 1726882529.00861: stdout chunk (state=3): >>>import 'marshal' # <<< 18445 1726882529.00902: stdout chunk (state=3): >>>import 'posix' # <<< 18445 1726882529.00944: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 18445 1726882529.00947: stdout chunk (state=3): >>># installing zipimport hook <<< 18445 1726882529.00997: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # <<< 18445 1726882529.01000: stdout chunk (state=3): >>># installed zipimport hook <<< 18445 1726882529.01082: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py <<< 18445 1726882529.01085: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 18445 1726882529.01123: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 18445 1726882529.01126: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' <<< 18445 1726882529.01129: stdout chunk (state=3): >>>import '_codecs' # <<< 18445 1726882529.01156: stdout chunk (state=3): >>>import 'codecs' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f81442d8dc0> <<< 18445 1726882529.01202: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 18445 1726882529.01225: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f814427d3a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f81442d8b20> <<< 18445 1726882529.01259: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py <<< 18445 1726882529.01262: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 18445 1726882529.01275: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f81442d8ac0> <<< 18445 1726882529.01302: stdout chunk (state=3): >>>import '_signal' # <<< 18445 1726882529.01337: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py <<< 18445 1726882529.01340: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 18445 1726882529.01347: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f814427d490> <<< 18445 1726882529.01379: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 18445 1726882529.01392: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 18445 1726882529.01421: stdout chunk (state=3): >>>import '_abc' # <<< 18445 1726882529.01424: stdout chunk (state=3): >>>import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f814427d940> <<< 18445 1726882529.01450: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f814427d670> <<< 18445 1726882529.01493: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 18445 1726882529.01507: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 18445 1726882529.01528: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 18445 1726882529.01551: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 18445 1726882529.01580: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 18445 1726882529.01599: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 18445 1726882529.01630: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8144234190> <<< 18445 1726882529.01648: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 18445 1726882529.01678: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 18445 1726882529.01775: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8144234220> <<< 18445 1726882529.01803: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 18445 1726882529.01848: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8144257850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8144234940> <<< 18445 1726882529.01872: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8144295880> <<< 18445 1726882529.01898: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' <<< 18445 1726882529.01901: stdout chunk (state=3): >>>import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f814422dd90> <<< 18445 1726882529.01958: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py <<< 18445 1726882529.01962: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8144257d90> <<< 18445 1726882529.02009: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f814427d970> <<< 18445 1726882529.02045: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
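The banner above confirms the payload is running under the remote host's /usr/bin/python3.9 with PYTHONVERBOSE=1 set by the preceding _low_level_execute_command() call, which is why every module initialization that follows is traced in the output. As a minimal, illustrative sketch only (the script path below is a placeholder, not the remote AnsiballZ temp file from this run), the same kind of import trace can be captured for an arbitrary script like this:

    import os
    import subprocess

    # Run a script with PYTHONVERBOSE=1, the same switch used in the
    # executing command above, and capture its output streams.
    env = dict(os.environ, PYTHONVERBOSE="1")
    result = subprocess.run(
        ["/usr/bin/python3.9", "some_module.py"],  # placeholder path
        env=env,
        capture_output=True,
        text=True,
    )
    # The "import ..." / "# code object from ..." trace lines appear in the
    # captured output alongside whatever the script itself prints.
    print(result.stdout)
    print(result.stderr)
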
<<< 18445 1726882529.02378: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 18445 1726882529.02419: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py <<< 18445 1726882529.02425: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 18445 1726882529.02431: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 18445 1726882529.02445: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 18445 1726882529.02487: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 18445 1726882529.02490: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 18445 1726882529.02492: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143faeeb0> <<< 18445 1726882529.02535: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143fb1f40> <<< 18445 1726882529.02569: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 18445 1726882529.02573: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 18445 1726882529.02621: stdout chunk (state=3): >>>import '_sre' # <<< 18445 1726882529.02625: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 18445 1726882529.02627: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 18445 1726882529.02656: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 18445 1726882529.02662: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143fa7610> <<< 18445 1726882529.02686: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143fad640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143fae370> <<< 18445 1726882529.02697: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 18445 1726882529.02779: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 18445 1726882529.02789: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 18445 1726882529.02819: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 18445 1726882529.02843: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 18445 1726882529.02884: stdout chunk (state=3): >>># extension module 
'_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' <<< 18445 1726882529.02889: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143e94dc0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143e948b0> import 'itertools' # <<< 18445 1726882529.02922: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py <<< 18445 1726882529.02925: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143e94eb0> <<< 18445 1726882529.02940: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 18445 1726882529.02972: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 18445 1726882529.03006: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143e94f70> <<< 18445 1726882529.03012: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py <<< 18445 1726882529.03014: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143e94e80> <<< 18445 1726882529.03024: stdout chunk (state=3): >>>import '_collections' # <<< 18445 1726882529.03090: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143f89d30> <<< 18445 1726882529.03094: stdout chunk (state=3): >>>import '_functools' # <<< 18445 1726882529.03106: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143f82610> <<< 18445 1726882529.03169: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' <<< 18445 1726882529.03179: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143f96670> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143fb5e20> <<< 18445 1726882529.03223: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 18445 1726882529.03227: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143ea6c70> <<< 18445 1726882529.03232: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143f89250> <<< 18445 1726882529.03252: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 18445 1726882529.03274: stdout chunk 
(state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143f96280> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143fbb9d0> <<< 18445 1726882529.03287: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 18445 1726882529.03319: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 18445 1726882529.03336: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 18445 1726882529.03356: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 18445 1726882529.03373: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143ea6fa0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143ea6d90> <<< 18445 1726882529.03395: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' <<< 18445 1726882529.03425: stdout chunk (state=3): >>>import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143ea6d00> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py <<< 18445 1726882529.03444: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py <<< 18445 1726882529.03460: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 18445 1726882529.03476: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 18445 1726882529.03534: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 18445 1726882529.03556: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' <<< 18445 1726882529.03576: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143e79370> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 18445 1726882529.03589: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 18445 1726882529.03614: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143e79460> <<< 18445 1726882529.03749: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143eaefa0> <<< 18445 1726882529.03776: stdout chunk (state=3): >>>import 'importlib.abc' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8143ea8a30> <<< 18445 1726882529.03790: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143ea8490> <<< 18445 1726882529.03810: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py <<< 18445 1726882529.03821: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 18445 1726882529.03842: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 18445 1726882529.03876: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 18445 1726882529.03889: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143dad1c0> <<< 18445 1726882529.03923: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143e64c70> <<< 18445 1726882529.03979: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143ea8eb0> <<< 18445 1726882529.03993: stdout chunk (state=3): >>>import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143fbb040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 18445 1726882529.04023: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 18445 1726882529.04046: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' <<< 18445 1726882529.04071: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143dbfaf0> import 'errno' # <<< 18445 1726882529.04101: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143dbfe20> <<< 18445 1726882529.04119: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 18445 1726882529.04139: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py <<< 18445 1726882529.04151: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143dd1730> <<< 18445 1726882529.04192: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 18445 1726882529.04208: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 18445 1726882529.04279: stdout chunk (state=3): >>>import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8143dd1c70> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143d6a3a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143dbff10> <<< 18445 1726882529.04300: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 18445 1726882529.04346: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143d7a280> <<< 18445 1726882529.04362: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143dd15b0> import 'pwd' # <<< 18445 1726882529.04391: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143d7a340> <<< 18445 1726882529.04446: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143ea69d0> <<< 18445 1726882529.04462: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 18445 1726882529.04485: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 18445 1726882529.04496: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 18445 1726882529.04545: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143d966a0> <<< 18445 1726882529.04569: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 18445 1726882529.04589: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143d96970> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143d96760> <<< 18445 1726882529.04609: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from 
'/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143d96850> <<< 18445 1726882529.04647: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 18445 1726882529.04852: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143d96ca0> <<< 18445 1726882529.04887: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143da21f0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143d968e0> <<< 18445 1726882529.04902: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143d89a30> <<< 18445 1726882529.04921: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143ea65b0> <<< 18445 1726882529.04948: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 18445 1726882529.04998: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 18445 1726882529.05390: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143d96a90> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f8143cbf670> <<< 18445 1726882529.05784: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 18445 1726882529.06056: stdout chunk (state=3): >>># zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available <<< 18445 1726882529.07869: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.08730: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143b537f0> <<< 18445 1726882529.08785: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc 
matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 18445 1726882529.08800: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 18445 1726882529.08825: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143be4760> <<< 18445 1726882529.08871: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143be4640> <<< 18445 1726882529.08897: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143be4370> <<< 18445 1726882529.08922: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 18445 1726882529.08974: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143be4490> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143be4190> import 'atexit' # <<< 18445 1726882529.09005: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143be4400> <<< 18445 1726882529.09018: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 18445 1726882529.09044: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 18445 1726882529.09092: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143be47c0> <<< 18445 1726882529.09117: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 18445 1726882529.09128: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 18445 1726882529.09154: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 18445 1726882529.09168: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 18445 1726882529.09185: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 18445 1726882529.09269: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143bbd7c0> <<< 18445 1726882529.09316: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' <<< 18445 1726882529.09344: stdout chunk (state=3): >>># extension module 
'_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143bbdb50> <<< 18445 1726882529.09360: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143bbd9a0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 18445 1726882529.09365: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 18445 1726882529.09387: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f81435874f0> <<< 18445 1726882529.09410: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143bddd30> <<< 18445 1726882529.09584: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143be4520> <<< 18445 1726882529.09601: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 18445 1726882529.09633: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143bdd190> <<< 18445 1726882529.09656: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 18445 1726882529.09692: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 18445 1726882529.09718: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 18445 1726882529.09736: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 18445 1726882529.09739: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143c0ea90> <<< 18445 1726882529.09812: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143bb1190> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143bb1790> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f814358cd00> <<< 18445 1726882529.09865: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' <<< 18445 1726882529.09869: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143bb16a0> <<< 18445 1726882529.09896: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143c32d30> <<< 18445 1726882529.09914: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 18445 1726882529.09927: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 18445 1726882529.09964: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 18445 1726882529.10037: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' <<< 18445 1726882529.10058: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f81435e59a0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143c3de50> <<< 18445 1726882529.10062: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 18445 1726882529.10124: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f81435f50d0> <<< 18445 1726882529.10127: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143c3de20> <<< 18445 1726882529.10140: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 18445 1726882529.10178: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 18445 1726882529.10205: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 18445 1726882529.10262: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143c44220> <<< 18445 1726882529.10393: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f81435f5100> <<< 18445 1726882529.10480: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143c08b80> <<< 18445 1726882529.10508: stdout chunk (state=3): >>># extension module 'systemd._reader' 
loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143c3dac0> <<< 18445 1726882529.10560: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143c3dd00> <<< 18445 1726882529.10572: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143cbf820> <<< 18445 1726882529.10590: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 18445 1726882529.10609: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 18445 1726882529.10660: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' <<< 18445 1726882529.10665: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f81435f10d0> <<< 18445 1726882529.10849: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' <<< 18445 1726882529.10867: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f81435e7370> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f81435f1d00> <<< 18445 1726882529.10914: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f81435f16a0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f81435f2130> # zipimport: zlib available # zipimport: zlib available <<< 18445 1726882529.10920: stdout chunk (state=3): >>>import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py <<< 18445 1726882529.10932: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.10995: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.11081: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18445 1726882529.11123: stdout chunk (state=3): >>>import ansible.module_utils.common # 
loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py <<< 18445 1726882529.11126: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.11224: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.11322: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.11753: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.12221: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # <<< 18445 1726882529.12234: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 18445 1726882529.12241: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 18445 1726882529.12299: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143b7c8b0> <<< 18445 1726882529.12367: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 18445 1726882529.12379: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143b81910> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f81431a46a0> <<< 18445 1726882529.12426: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py <<< 18445 1726882529.12433: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.12455: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.12467: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py <<< 18445 1726882529.12473: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.12592: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.12720: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 18445 1726882529.12741: stdout chunk (state=3): >>>import 'copy' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8143bbb7f0> # zipimport: zlib available <<< 18445 1726882529.13134: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.13494: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.13548: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.13610: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py <<< 18445 1726882529.13616: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.13645: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.13703: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available <<< 18445 1726882529.13741: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.13827: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available <<< 18445 1726882529.13843: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py <<< 18445 1726882529.13849: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.13874: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.13916: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 18445 1726882529.13921: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.14103: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.14291: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 18445 1726882529.14319: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 18445 1726882529.14325: stdout chunk (state=3): >>>import '_ast' # <<< 18445 1726882529.14395: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f81431a9d90> # zipimport: zlib available <<< 18445 1726882529.14458: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.14534: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py <<< 18445 1726882529.14543: stdout chunk 
(state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 18445 1726882529.14559: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.14595: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.14632: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py <<< 18445 1726882529.14640: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.14675: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.14715: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.14806: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.14870: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 18445 1726882529.14886: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 18445 1726882529.14961: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143b6f0a0> <<< 18445 1726882529.15050: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143171070> <<< 18445 1726882529.15085: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py <<< 18445 1726882529.15092: stdout chunk (state=3): >>>import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available <<< 18445 1726882529.15144: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.15195: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.15219: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.15259: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 18445 1726882529.15269: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 18445 1726882529.15284: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 18445 1726882529.15323: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 18445 1726882529.15338: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 18445 1726882529.15367: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 18445 
1726882529.15447: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143b78160> <<< 18445 1726882529.15486: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143b74cd0> <<< 18445 1726882529.15557: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f81431a9bb0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available <<< 18445 1726882529.15611: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 18445 1726882529.15693: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py <<< 18445 1726882529.15707: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available <<< 18445 1726882529.15827: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18445 1726882529.15833: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.15849: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.15888: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.15926: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.15951: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.15988: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py <<< 18445 1726882529.15995: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.16061: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.16126: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.16138: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.16176: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available <<< 18445 1726882529.16326: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.16466: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.16498: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.16546: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from 
'/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 18445 1726882529.16568: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py <<< 18445 1726882529.16604: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py <<< 18445 1726882529.16607: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 18445 1726882529.16639: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8142f24a60> <<< 18445 1726882529.16672: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py <<< 18445 1726882529.16675: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 18445 1726882529.16686: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 18445 1726882529.16719: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 18445 1726882529.16750: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py <<< 18445 1726882529.16754: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f81431836d0> <<< 18445 1726882529.16799: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143183af0> <<< 18445 1726882529.16866: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f814316a250> <<< 18445 1726882529.16881: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f814316aa30> <<< 18445 1726882529.16906: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f81431b9460> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f81431b9910> <<< 18445 1726882529.16926: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 18445 1726882529.16957: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' <<< 18445 1726882529.16974: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 18445 1726882529.17014: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module 
'_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f81431b6d00> <<< 18445 1726882529.17040: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f81431b6d60> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py <<< 18445 1726882529.17067: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 18445 1726882529.17095: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f81431b6250> <<< 18445 1726882529.17105: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 18445 1726882529.17116: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 18445 1726882529.17149: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8142f8cf70> <<< 18445 1726882529.17186: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f81431ca4c0> <<< 18445 1726882529.17206: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f81431b9310> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py <<< 18445 1726882529.17247: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18445 1726882529.17251: stdout chunk (state=3): >>>import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py <<< 18445 1726882529.17266: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.17313: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.17371: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py <<< 18445 1726882529.17375: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.17408: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.17466: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py <<< 18445 1726882529.17495: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py <<< 18445 1726882529.17505: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.17516: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.17552: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py <<< 18445 1726882529.17555: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.17600: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.17648: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py <<< 18445 1726882529.17651: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.17683: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.17727: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available <<< 18445 1726882529.17784: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.17835: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.17882: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.17941: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py <<< 18445 1726882529.17944: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available <<< 18445 1726882529.18342: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.18708: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available <<< 18445 1726882529.18750: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.18801: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.18857: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.18874: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available <<< 18445 1726882529.18897: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.18937: stdout chunk (state=3): >>>import 
ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py <<< 18445 1726882529.18940: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.18981: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.19038: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available <<< 18445 1726882529.19073: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.19093: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available <<< 18445 1726882529.19119: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.19150: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available <<< 18445 1726882529.19218: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.19296: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py <<< 18445 1726882529.19306: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 18445 1726882529.19325: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8142ea6ca0> <<< 18445 1726882529.19337: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 18445 1726882529.19358: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 18445 1726882529.19534: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8142ea6fd0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available <<< 18445 1726882529.19579: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.19643: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available <<< 18445 1726882529.19722: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.19797: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py <<< 18445 1726882529.19817: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.19858: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.19931: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available <<< 18445 1726882529.19969: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.20007: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 18445 1726882529.20031: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 18445 1726882529.20182: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8142e99370> <<< 18445 1726882529.20430: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8142ee7bb0> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py <<< 18445 1726882529.20434: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.20478: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.20529: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py <<< 18445 1726882529.20533: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.20600: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.20798: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.20802: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.20900: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available <<< 18445 1726882529.20931: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.20980: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py <<< 18445 1726882529.20984: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.21011: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.21059: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 18445 1726882529.21113: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' <<< 18445 1726882529.21152: stdout chunk (state=3): >>># extension module 'termios' executed from 
'/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8142e1f160> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8142e1f2b0> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py <<< 18445 1726882529.21157: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.21187: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.21234: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py <<< 18445 1726882529.21237: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.21362: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.21489: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available <<< 18445 1726882529.21576: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.21651: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.21688: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.21732: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py <<< 18445 1726882529.21735: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available <<< 18445 1726882529.21828: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.21840: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.21956: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.22079: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available <<< 18445 1726882529.22187: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.22294: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py <<< 18445 1726882529.22297: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 
1726882529.22318: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.22348: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.22789: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.23208: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py <<< 18445 1726882529.23211: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.23294: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.23384: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available <<< 18445 1726882529.23472: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.23560: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py <<< 18445 1726882529.23565: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.23683: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.23817: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available <<< 18445 1726882529.23844: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py <<< 18445 1726882529.23848: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.23877: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.23929: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py <<< 18445 1726882529.23932: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.24010: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.24090: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.24257: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.24435: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py <<< 18445 1726882529.24439: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available <<< 18445 
1726882529.24468: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.24494: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py <<< 18445 1726882529.24521: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.24524: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.24559: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py <<< 18445 1726882529.24562: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.24613: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.24680: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available <<< 18445 1726882529.24702: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.24733: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py <<< 18445 1726882529.24736: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.24783: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.24836: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available <<< 18445 1726882529.24887: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.24944: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py <<< 18445 1726882529.24947: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.25156: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.25371: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available <<< 18445 1726882529.25419: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.25480: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py <<< 18445 1726882529.25483: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.25507: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.25539: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available <<< 18445 1726882529.25574: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.25607: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py <<< 18445 1726882529.25610: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.25633: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.25672: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available <<< 18445 1726882529.25739: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.25813: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py <<< 18445 1726882529.25844: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py <<< 18445 1726882529.25847: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.25884: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.25936: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py <<< 18445 1726882529.25939: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.25958: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.25974: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.26009: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.26053: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.26106: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.26187: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py <<< 18445 1726882529.26190: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.26229: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.26285: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py <<< 18445 1726882529.26288: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.26440: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.26600: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available <<< 18445 1726882529.26639: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.26691: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py <<< 18445 1726882529.26694: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.26727: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.26777: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available <<< 18445 1726882529.26840: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.26924: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py <<< 18445 1726882529.26928: stdout chunk (state=3): >>>import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available <<< 18445 1726882529.26997: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.27079: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py <<< 18445 1726882529.27082: stdout chunk (state=3): >>>import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 18445 1726882529.27155: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882529.27331: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 18445 1726882529.27350: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 18445 1726882529.27398: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension 
module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8142e72070> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8142e72c40> <<< 18445 1726882529.27448: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8142c88b80> <<< 18445 1726882529.28928: stdout chunk (state=3): >>>import 'gc' # <<< 18445 1726882529.36049: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py <<< 18445 1726882529.36103: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8142e725e0> <<< 18445 1726882529.36428: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8142e1f520> # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8142c802b0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8142e1af10> <<< 18445 1726882529.36750: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 18445 1726882529.57043: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": 
true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1X<<< 18445 1726882529.57084: stdout chunk (state=3): >>>O02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_fibre_channel_wwn": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": 
"UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "35", "second": "29", "epoch": "1726882529", "epoch_int": "1726882529", "date": "2024-09-20", "time": "21:35:29", "iso8601_micro": "2024-09-21T01:35:29.296885Z", "iso8601": "2024-09-21T01:35:29Z", "iso8601_basic": "20240920T213529296885", "iso8601_basic_short": "20240920T213529", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_loadavg": {"1m": 0.5, "5m": 0.4, "15m": 0.21}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "et<<< 18445 1726882529.57154: stdout chunk (state=3): >>>h0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off 
[fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [<<< 18445 1726882529.57193: stdout chunk (state=3): >>>fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", 
"macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2813, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 719, "free": 2813}, "nocache": {"free": 3275, "used": 257}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 687, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264239304704, "block_size": 4096, "block_total": 65519355, "block_available": 64511549, "block_used": 1007806, "inode_total": 131071472, "inode_available": 130998698, "inode_used": 72774, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18445 1726882529.57859: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback <<< 18445 1726882529.58226: stdout chunk (state=3): >>># clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore 
sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref <<< 18445 1726882529.58426: stdout chunk (state=3): >>># cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale <<< 18445 1726882529.58438: stdout chunk (state=3): >>># cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types<<< 18445 1726882529.58478: stdout chunk (state=3): >>> # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator<<< 18445 1726882529.58490: stdout chunk (state=3): >>> # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii<<< 18445 1726882529.58509: stdout chunk (state=3): >>> # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno<<< 18445 1726882529.58532: stdout chunk (state=3): >>> # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random<<< 18445 1726882529.58554: stdout chunk (state=3): >>> # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] 
removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__<<< 18445 1726882529.58581: stdout chunk (state=3): >>> # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform<<< 18445 1726882529.58604: stdout chunk (state=3): >>> # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string <<< 18445 1726882529.58628: stdout chunk (state=3): >>># cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text <<< 18445 1726882529.58652: stdout chunk (state=3): >>># destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections<<< 18445 1726882529.58682: stdout chunk (state=3): >>> # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters<<< 18445 1726882529.58704: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing 
ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process<<< 18445 1726882529.58731: stdout chunk (state=3): >>> # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle <<< 18445 1726882529.58757: stdout chunk (state=3): >>># cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter<<< 18445 1726882529.58785: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob<<< 18445 1726882529.58808: stdout chunk (state=3): >>> # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing 
ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass<<< 18445 1726882529.58832: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix <<< 18445 1726882529.58866: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd<<< 18445 1726882529.58907: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy 
ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution <<< 18445 1726882529.58929: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix<<< 18445 1726882529.58956: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd<<< 18445 1726882529.58982: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] 
removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection<<< 18445 1726882529.59068: stdout chunk (state=3): >>> # cleanup[2] removing multiprocessing.dummy <<< 18445 1726882529.59415: stdout chunk (state=3): >>># destroy _sitebuiltins<<< 18445 1726882529.59456: stdout chunk (state=3): >>> <<< 18445 1726882529.59484: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 18445 1726882529.59528: stdout chunk (state=3): >>># destroy zipimport <<< 18445 1726882529.59557: stdout chunk (state=3): >>># destroy _compression<<< 18445 1726882529.59617: stdout chunk (state=3): >>> # destroy binascii # destroy importlib <<< 18445 1726882529.59635: stdout chunk (state=3): >>># destroy bz2 # destroy lzma <<< 18445 1726882529.59714: stdout chunk (state=3): >>># destroy __main__ # destroy locale <<< 18445 1726882529.59761: stdout chunk (state=3): >>># destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder <<< 18445 1726882529.59805: stdout chunk (state=3): >>># destroy json.encoder # destroy json.scanner <<< 18445 1726882529.59820: stdout chunk (state=3): >>># destroy _json # destroy encodings <<< 18445 1726882529.59868: stdout chunk (state=3): >>># destroy syslog<<< 18445 1726882529.59884: stdout chunk (state=3): >>> # destroy uuid <<< 18445 1726882529.59971: stdout chunk (state=3): >>># destroy selinux # destroy distro<<< 18445 1726882529.59987: stdout chunk (state=3): >>> # destroy logging # destroy argparse <<< 18445 1726882529.60061: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector<<< 18445 1726882529.60128: stdout chunk (state=3): >>> # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy <<< 18445 1726882529.60170: stdout chunk (state=3): >>># destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle <<< 18445 1726882529.60204: stdout chunk (state=3): >>># destroy queue <<< 18445 1726882529.60231: stdout chunk (state=3): >>># destroy multiprocessing.reduction <<< 18445 1726882529.60269: stdout chunk (state=3): >>># destroy shlex<<< 18445 1726882529.60297: stdout chunk (state=3): >>> # destroy datetime <<< 18445 1726882529.60323: stdout chunk (state=3): >>># destroy base64 <<< 18445 1726882529.60353: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux<<< 18445 1726882529.60392: stdout chunk (state=3): >>> # destroy getpass <<< 18445 1726882529.60406: stdout chunk (state=3): >>># destroy json <<< 18445 1726882529.60447: stdout chunk (state=3): >>># destroy socket # destroy struct<<< 18445 1726882529.60489: stdout chunk (state=3): >>> # destroy glob<<< 18445 1726882529.60531: stdout chunk (state=3): >>> # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 18445 1726882529.60591: stdout chunk (state=3): >>># destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context <<< 18445 1726882529.60629: stdout chunk (state=3): >>># destroy multiprocessing.process # destroy multiprocessing.util # destroy array <<< 18445 1726882529.60641: stdout chunk (state=3): >>># destroy multiprocessing.dummy.connection <<< 18445 1726882529.60725: stdout chunk (state=3): >>># cleanup[3] wiping gc # cleanup[3] wiping encodings.idna<<< 18445 1726882529.60796: stdout chunk (state=3): >>> # destroy 
stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl<<< 18445 1726882529.60847: stdout chunk (state=3): >>> # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux <<< 18445 1726882529.60925: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves<<< 18445 1726882529.60989: stdout chunk (state=3): >>> # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128<<< 18445 1726882529.61054: stdout chunk (state=3): >>> # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid<<< 18445 1726882529.61102: stdout chunk (state=3): >>> # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize <<< 18445 1726882529.61163: stdout chunk (state=3): >>># cleanup[3] wiping platform # destroy subprocess <<< 18445 1726882529.61222: stdout chunk (state=3): >>># cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl <<< 18445 1726882529.61287: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib<<< 18445 1726882529.61331: stdout chunk (state=3): >>> # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil <<< 18445 1726882529.61399: stdout chunk (state=3): >>># destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd <<< 18445 1726882529.61453: stdout chunk (state=3): >>># cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno<<< 18445 1726882529.61500: stdout chunk (state=3): >>> # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external<<< 18445 1726882529.61543: stdout chunk (state=3): >>> # cleanup[3] wiping importlib._bootstrap <<< 18445 1726882529.61586: stdout chunk (state=3): >>># cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum <<< 18445 1726882529.61618: stdout chunk (state=3): >>># destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc<<< 18445 1726882529.61644: stdout chunk (state=3): >>> # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse<<< 18445 1726882529.61692: stdout chunk (state=3): >>> # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale<<< 18445 1726882529.61724: stdout chunk (state=3): >>> # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat<<< 18445 1726882529.61744: stdout chunk (state=3): >>> # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal<<< 18445 
1726882529.61774: stdout chunk (state=3): >>> # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs<<< 18445 1726882529.61804: stdout chunk (state=3): >>> # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib <<< 18445 1726882529.61823: stdout chunk (state=3): >>># cleanup[3] wiping sys # cleanup[3] wiping builtins<<< 18445 1726882529.61863: stdout chunk (state=3): >>> # destroy gc <<< 18445 1726882529.61906: stdout chunk (state=3): >>># destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing<<< 18445 1726882529.61938: stdout chunk (state=3): >>> # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma<<< 18445 1726882529.62167: stdout chunk (state=3): >>> # destroy zlib # destroy _signal <<< 18445 1726882529.62211: stdout chunk (state=3): >>># destroy platform<<< 18445 1726882529.62269: stdout chunk (state=3): >>> # destroy _uuid # destroy _sre # destroy sre_parse <<< 18445 1726882529.62295: stdout chunk (state=3): >>># destroy tokenize <<< 18445 1726882529.62320: stdout chunk (state=3): >>># destroy _heapq # destroy posixpath<<< 18445 1726882529.62392: stdout chunk (state=3): >>> # destroy stat <<< 18445 1726882529.62453: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd <<< 18445 1726882529.62492: stdout chunk (state=3): >>># destroy grp # destroy _posixsubprocess # destroy selectors <<< 18445 1726882529.62541: stdout chunk (state=3): >>># destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request<<< 18445 1726882529.62618: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator<<< 18445 1726882529.62644: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves # destroy _operator <<< 18445 1726882529.62717: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp <<< 18445 1726882529.62729: stdout chunk (state=3): >>># destroy io # destroy marshal <<< 18445 1726882529.62809: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks<<< 18445 1726882529.62826: stdout chunk (state=3): >>> <<< 18445 1726882529.63397: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 18445 1726882529.63407: stdout chunk (state=3): >>><<< 18445 1726882529.63419: stderr chunk (state=3): >>><<< 18445 1726882529.63593: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f81442d8dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f814427d3a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f81442d8b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f81442d8ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f814427d490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f814427d940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f814427d670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8144234190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8144234220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from 
'/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8144257850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8144234940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8144295880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f814422dd90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8144257d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f814427d970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143faeeb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143fb1f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143fa7610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143fad640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143fae370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # 
code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143e94dc0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143e948b0> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143e94eb0> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143e94f70> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143e94e80> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143f89d30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143f82610> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143f96670> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143fb5e20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143ea6c70> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143f89250> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143f96280> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143fbb9d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143ea6fa0> 
import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143ea6d90> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143ea6d00> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143e79370> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143e79460> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143eaefa0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143ea8a30> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143ea8490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143dad1c0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143e64c70> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143ea8eb0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143fbb040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143dbfaf0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143dbfe20> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches 
/usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143dd1730> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143dd1c70> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143d6a3a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143dbff10> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143d7a280> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143dd15b0> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143d7a340> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143ea69d0> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143d966a0> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143d96970> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143d96760> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143d96850> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143d96ca0> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143da21f0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143d968e0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143d89a30> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143ea65b0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143d96a90> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f8143cbf670> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143b537f0> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143be4760> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143be4640> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f8143be4370> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143be4490> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143be4190> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143be4400> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143be47c0> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143bbd7c0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143bbdb50> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143bbd9a0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f81435874f0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143bddd30> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143be4520> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143bdd190> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143c0ea90> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143bb1190> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143bb1790> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f814358cd00> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143bb16a0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143c32d30> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f81435e59a0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143c3de50> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f81435f50d0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143c3de20> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143c44220> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f81435f5100> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143c08b80> # extension 
module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143c3dac0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143c3dd00> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143cbf820> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f81435f10d0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f81435e7370> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f81435f1d00> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f81435f16a0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f81435f2130> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 
'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143b7c8b0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143b81910> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f81431a46a0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143bbb7f0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f81431a9d90> # zipimport: zlib available # zipimport: 
zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143b6f0a0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143171070> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143b78160> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8143b74cd0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f81431a9bb0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8142f24a60> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f81431836d0> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8143183af0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f814316a250> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f814316aa30> import 
'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f81431b9460> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f81431b9910> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f81431b6d00> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f81431b6d60> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f81431b6250> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8142f8cf70> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f81431ca4c0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f81431b9310> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # 
zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8142ea6ca0> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 
'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8142ea6fd0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8142e99370> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8142ee7bb0> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8142e1f160> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f8142e1f2b0> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib 
available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: 
zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_w5trxmab/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8142e72070> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8142e72c40> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8142c88b80> import 'gc' # # /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8142e725e0> # /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8142e1f520> # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8142c802b0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8142e1af10> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": 
"ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_fibre_channel_wwn": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "35", "second": "29", "epoch": "1726882529", "epoch_int": "1726882529", "date": "2024-09-20", "time": "21:35:29", "iso8601_micro": "2024-09-21T01:35:29.296885Z", "iso8601": "2024-09-21T01:35:29Z", "iso8601_basic": "20240920T213529296885", "iso8601_basic_short": "20240920T213529", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_loadavg": {"1m": 0.5, "5m": 0.4, "15m": 0.21}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": 
"off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": 
"off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2813, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 719, "free": 2813}, "nocache": {"free": 3275, "used": 257}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 687, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264239304704, "block_size": 4096, "block_total": 65519355, "block_available": 64511549, "block_used": 1007806, "inode_total": 131071472, "inode_available": 130998698, "inode_used": 72774, "uuid": 
"6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] 
removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # 
cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # 
cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy 
ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy 
ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # 
cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
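The ssh debug output above ("auto-mux: Trying existing master", "mux_client_request_session: master session id: 2") shows that this command ran over an already-open OpenSSH ControlMaster socket rather than a fresh TCP connection. The exact ssh_args used for this run are not visible in the log; the snippet below is only an illustrative way of requesting the same multiplexing behaviour from inventory variables, with placeholder values, and mirrors what ansible-core already passes by default.

    # group_vars/all.yml (illustrative only; control path and timeout are placeholders,
    # not taken from this run)
    ansible_ssh_common_args: >-
      -o ControlMaster=auto
      -o ControlPersist=60s
      -o ControlPath=~/.ansible/cp/%h-%p-%r

Because the control socket persists between task invocations, the repeated _low_level_execute_command() calls later in this log reuse session ids on the same master instead of renegotiating SSH each time.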
[WARNING]: Module invocation had junk after the JSON data: # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy 
zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # 
cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing 
ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy 
ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy 
multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # 
cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks [WARNING]: Platform linux on host managed_node1 is using the discovered Python interpreter at /usr/bin/python3.9, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 18445 1726882529.64862: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882528.1723437-18473-61144104753262/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18445 1726882529.64894: _low_level_execute_command(): starting 18445 1726882529.64904: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882528.1723437-18473-61144104753262/ > /dev/null 2>&1 && sleep 0' 18445 1726882529.65626: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18445 1726882529.65640: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18445 1726882529.65655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18445 1726882529.65680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18445 1726882529.65736: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 18445 1726882529.65749: stderr chunk (state=3): >>>debug2: match not found <<< 18445 1726882529.65765: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18445 1726882529.65784: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass <<< 18445 1726882529.65799: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 18445 1726882529.65815: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18445 1726882529.65834: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18445 1726882529.65848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18445 1726882529.65866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18445 1726882529.65879: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 18445 1726882529.65891: stderr chunk (state=3): >>>debug2: match found <<< 18445 1726882529.65909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18445 1726882529.65995: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 18445 1726882529.66020: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18445 1726882529.66047: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18445 1726882529.66184: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18445 1726882529.68816: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18445 1726882529.68907: stderr chunk (state=3): >>><<< 18445 1726882529.68912: stdout chunk (state=3): >>><<< 18445 1726882529.68930: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18445 1726882529.68939: handler run complete 18445 1726882529.69053: variable 'ansible_facts' from source: unknown 18445 1726882529.69175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882529.70446: variable 'ansible_facts' from source: unknown 18445 1726882529.70542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882529.70644: attempt loop complete, returning result 18445 1726882529.70648: _execute() done 18445 1726882529.70650: dumping result to json 18445 1726882529.70673: done dumping result, returning 18445 1726882529.70681: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0e448fcc-3ce9-f6eb-935c-00000000007c] 18445 
1726882529.70685: sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000007c 18445 1726882529.70945: done sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000007c 18445 1726882529.70948: WORKER PROCESS EXITING ok: [managed_node1] 18445 1726882529.71177: no more pending results, returning what we have 18445 1726882529.71179: results queue empty 18445 1726882529.71179: checking for any_errors_fatal 18445 1726882529.71180: done checking for any_errors_fatal 18445 1726882529.71181: checking for max_fail_percentage 18445 1726882529.71182: done checking for max_fail_percentage 18445 1726882529.71182: checking to see if all hosts have failed and the running result is not ok 18445 1726882529.71183: done checking to see if all hosts have failed 18445 1726882529.71183: getting the remaining hosts for this loop 18445 1726882529.71184: done getting the remaining hosts for this loop 18445 1726882529.71187: getting the next task for host managed_node1 18445 1726882529.71191: done getting next task for host managed_node1 18445 1726882529.71192: ^ task is: TASK: meta (flush_handlers) 18445 1726882529.71193: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882529.71196: getting variables 18445 1726882529.71197: in VariableManager get_vars() 18445 1726882529.71223: Calling all_inventory to load vars for managed_node1 18445 1726882529.71229: Calling groups_inventory to load vars for managed_node1 18445 1726882529.71232: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882529.71239: Calling all_plugins_play to load vars for managed_node1 18445 1726882529.71241: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882529.71243: Calling groups_plugins_play to load vars for managed_node1 18445 1726882529.71356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882529.71475: done with get_vars() 18445 1726882529.71484: done getting variables 18445 1726882529.71529: in VariableManager get_vars() 18445 1726882529.71536: Calling all_inventory to load vars for managed_node1 18445 1726882529.71537: Calling groups_inventory to load vars for managed_node1 18445 1726882529.71539: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882529.71542: Calling all_plugins_play to load vars for managed_node1 18445 1726882529.71543: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882529.71547: Calling groups_plugins_play to load vars for managed_node1 18445 1726882529.71736: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882529.71847: done with get_vars() 18445 1726882529.71858: done queuing things up, now waiting for results queue to drain 18445 1726882529.71860: results queue empty 18445 1726882529.71860: checking for any_errors_fatal 18445 1726882529.71862: done checking for any_errors_fatal 18445 1726882529.71862: checking for max_fail_percentage 18445 1726882529.71865: done checking for max_fail_percentage 18445 1726882529.71865: checking to see if all hosts have failed and the running result is not ok 18445 1726882529.71866: done checking to see if all hosts have failed 18445 1726882529.71866: getting the remaining hosts for this 
loop 18445 1726882529.71867: done getting the remaining hosts for this loop 18445 1726882529.71869: getting the next task for host managed_node1 18445 1726882529.71871: done getting next task for host managed_node1 18445 1726882529.71873: ^ task is: TASK: Include the task 'el_repo_setup.yml' 18445 1726882529.71874: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882529.71876: getting variables 18445 1726882529.71877: in VariableManager get_vars() 18445 1726882529.71882: Calling all_inventory to load vars for managed_node1 18445 1726882529.71884: Calling groups_inventory to load vars for managed_node1 18445 1726882529.71885: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882529.71889: Calling all_plugins_play to load vars for managed_node1 18445 1726882529.71891: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882529.71892: Calling groups_plugins_play to load vars for managed_node1 18445 1726882529.71972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882529.72080: done with get_vars() 18445 1726882529.72085: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_initscripts.yml:10 Friday 20 September 2024 21:35:29 -0400 (0:00:01.622) 0:00:01.658 ****** 18445 1726882529.72138: entering _queue_task() for managed_node1/include_tasks 18445 1726882529.72140: Creating lock for include_tasks 18445 1726882529.72340: worker is 1 (out of 1 available) 18445 1726882529.72352: exiting _queue_task() for managed_node1/include_tasks 18445 1726882529.72371: done queuing things up, now waiting for results queue to drain 18445 1726882529.72373: waiting for pending results... 
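The "[WARNING]: Platform linux on host managed_node1 is using the discovered Python interpreter at /usr/bin/python3.9 ..." entry a little further up is produced by interpreter discovery: ansible-core probed managed_node1, selected /usr/bin/python3.9, and warns that a later change on the host could make that path resolve to a different Python. Pinning the interpreter in inventory makes the choice explicit and silences the warning. The snippet below is a generic illustration and is not part of the inventory used in this run.

    # host_vars/managed_node1.yml (illustrative; the value is copied from the warning above)
    ansible_python_interpreter: /usr/bin/python3.9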
18445 1726882529.72574: running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' 18445 1726882529.72686: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000006 18445 1726882529.72718: variable 'ansible_search_path' from source: unknown 18445 1726882529.72757: calling self._execute() 18445 1726882529.72846: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882529.72860: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882529.72880: variable 'omit' from source: magic vars 18445 1726882529.72986: _execute() done 18445 1726882529.72993: dumping result to json 18445 1726882529.72999: done dumping result, returning 18445 1726882529.73008: done running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' [0e448fcc-3ce9-f6eb-935c-000000000006] 18445 1726882529.73018: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000006 18445 1726882529.73184: no more pending results, returning what we have 18445 1726882529.73189: in VariableManager get_vars() 18445 1726882529.73220: Calling all_inventory to load vars for managed_node1 18445 1726882529.73223: Calling groups_inventory to load vars for managed_node1 18445 1726882529.73226: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882529.73239: Calling all_plugins_play to load vars for managed_node1 18445 1726882529.73242: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882529.73247: Calling groups_plugins_play to load vars for managed_node1 18445 1726882529.73474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882529.73670: done with get_vars() 18445 1726882529.73681: variable 'ansible_search_path' from source: unknown 18445 1726882529.73695: we have included files to process 18445 1726882529.73696: generating all_blocks data 18445 1726882529.73698: done generating all_blocks data 18445 1726882529.73699: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 18445 1726882529.73700: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 18445 1726882529.73702: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 18445 1726882529.74173: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000006 18445 1726882529.74176: WORKER PROCESS EXITING 18445 1726882529.74649: in VariableManager get_vars() 18445 1726882529.74673: done with get_vars() 18445 1726882529.74685: done processing included file 18445 1726882529.74687: iterating over new_blocks loaded from include file 18445 1726882529.74688: in VariableManager get_vars() 18445 1726882529.74697: done with get_vars() 18445 1726882529.74698: filtering new block on tags 18445 1726882529.74711: done filtering new block on tags 18445 1726882529.74714: in VariableManager get_vars() 18445 1726882529.74723: done with get_vars() 18445 1726882529.74724: filtering new block on tags 18445 1726882529.74738: done filtering new block on tags 18445 1726882529.74764: in VariableManager get_vars() 18445 1726882529.74776: done with get_vars() 18445 1726882529.74777: filtering new block on tags 18445 1726882529.74791: done filtering new block on tags 18445 1726882529.74793: done iterating over new_blocks loaded from include file included: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node1 18445 1726882529.74797: extending task lists for all hosts with included blocks 18445 1726882529.74842: done extending task lists 18445 1726882529.74843: done processing included files 18445 1726882529.74844: results queue empty 18445 1726882529.74844: checking for any_errors_fatal 18445 1726882529.74845: done checking for any_errors_fatal 18445 1726882529.74846: checking for max_fail_percentage 18445 1726882529.74847: done checking for max_fail_percentage 18445 1726882529.74848: checking to see if all hosts have failed and the running result is not ok 18445 1726882529.74849: done checking to see if all hosts have failed 18445 1726882529.74849: getting the remaining hosts for this loop 18445 1726882529.74850: done getting the remaining hosts for this loop 18445 1726882529.74853: getting the next task for host managed_node1 18445 1726882529.74858: done getting next task for host managed_node1 18445 1726882529.74861: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 18445 1726882529.74863: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882529.74981: getting variables 18445 1726882529.74982: in VariableManager get_vars() 18445 1726882529.74990: Calling all_inventory to load vars for managed_node1 18445 1726882529.74992: Calling groups_inventory to load vars for managed_node1 18445 1726882529.74995: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882529.75000: Calling all_plugins_play to load vars for managed_node1 18445 1726882529.75002: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882529.75005: Calling groups_plugins_play to load vars for managed_node1 18445 1726882529.75257: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882529.75461: done with get_vars() 18445 1726882529.75474: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 21:35:29 -0400 (0:00:00.034) 0:00:01.692 ****** 18445 1726882529.75545: entering _queue_task() for managed_node1/setup 18445 1726882529.75834: worker is 1 (out of 1 available) 18445 1726882529.75851: exiting _queue_task() for managed_node1/setup 18445 1726882529.75861: done queuing things up, now waiting for results queue to drain 18445 1726882529.75862: waiting for pending results... 
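The task just queued, "Gather the minimum subset of ansible_facts required by the network role test" at el_repo_setup.yml:3, is dispatched to the setup action ("entering _queue_task() for managed_node1/setup"), and the entries that follow show it evaluating the conditional not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts, with network_test_required_facts coming from task vars. The log does not reproduce the task file itself; the sketch below is a plausible reconstruction in which gather_subset and the contents of network_test_required_facts are assumptions, while the task name, the when expression, and the use of a task-level var are confirmed by the log.

    # Hypothetical shape of el_repo_setup.yml:3; only the name, the 'when' expression,
    # and the task-vars source are confirmed by this log
    - name: Gather the minimum subset of ansible_facts required by the network role test
      setup:
        gather_subset: min   # assumed subset
      vars:
        network_test_required_facts:   # assumed contents
          - distribution
          - distribution_major_version
      when: not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts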
18445 1726882529.76012: running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 18445 1726882529.76077: in run() - task 0e448fcc-3ce9-f6eb-935c-00000000008d 18445 1726882529.76086: variable 'ansible_search_path' from source: unknown 18445 1726882529.76090: variable 'ansible_search_path' from source: unknown 18445 1726882529.76116: calling self._execute() 18445 1726882529.76166: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882529.76174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882529.76184: variable 'omit' from source: magic vars 18445 1726882529.76538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882529.78721: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882529.78807: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882529.78848: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882529.78888: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882529.78931: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882529.79020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882529.79083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882529.79115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882529.79201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882529.79223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882529.79442: variable 'ansible_facts' from source: unknown 18445 1726882529.79498: variable 'network_test_required_facts' from source: task vars 18445 1726882529.79534: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 18445 1726882529.79539: variable 'omit' from source: magic vars 18445 1726882529.79564: variable 'omit' from source: magic vars 18445 1726882529.79594: variable 'omit' from source: magic vars 18445 1726882529.79611: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18445 1726882529.79630: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18445 1726882529.79643: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18445 1726882529.79658: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18445 1726882529.79666: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18445 1726882529.79696: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18445 1726882529.79700: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882529.79702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882529.79760: Set connection var ansible_shell_type to sh 18445 1726882529.79768: Set connection var ansible_module_compression to ZIP_DEFLATED 18445 1726882529.79774: Set connection var ansible_connection to ssh 18445 1726882529.79780: Set connection var ansible_pipelining to False 18445 1726882529.79788: Set connection var ansible_shell_executable to /bin/sh 18445 1726882529.79800: Set connection var ansible_timeout to 10 18445 1726882529.79815: variable 'ansible_shell_executable' from source: unknown 18445 1726882529.79818: variable 'ansible_connection' from source: unknown 18445 1726882529.79820: variable 'ansible_module_compression' from source: unknown 18445 1726882529.79823: variable 'ansible_shell_type' from source: unknown 18445 1726882529.79825: variable 'ansible_shell_executable' from source: unknown 18445 1726882529.79827: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882529.79831: variable 'ansible_pipelining' from source: unknown 18445 1726882529.79833: variable 'ansible_timeout' from source: unknown 18445 1726882529.79837: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882529.79941: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 18445 1726882529.79949: variable 'omit' from source: magic vars 18445 1726882529.79953: starting attempt loop 18445 1726882529.79959: running the handler 18445 1726882529.79971: _low_level_execute_command(): starting 18445 1726882529.79977: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18445 1726882529.80448: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18445 1726882529.80470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18445 1726882529.80484: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18445 1726882529.80494: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18445 1726882529.80534: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 
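The connection variables dumped above include "Set connection var ansible_pipelining to False", which is why the action plugin goes through the file-based flow recorded next: run 'echo ~' to find the remote home, create a per-task temp directory, sftp the AnsiballZ_setup.py payload across, chmod it, and execute it. With pipelining enabled, ansible-core instead feeds the module to the remote Python interpreter's stdin over the existing SSH channel and skips most of those round trips (when privilege escalation is used this generally requires that requiretty be disabled in sudoers). A minimal, illustrative way to enable it per group, not taken from this run:

    # group_vars/all.yml (illustrative only)
    ansible_pipelining: true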
18445 1726882529.80554: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18445 1726882529.80651: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18445 1726882529.82462: stdout chunk (state=3): >>>/root <<< 18445 1726882529.82636: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18445 1726882529.82646: stdout chunk (state=3): >>><<< 18445 1726882529.82661: stderr chunk (state=3): >>><<< 18445 1726882529.82695: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 18445 1726882529.82719: _low_level_execute_command(): starting 18445 1726882529.82728: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882529.8270774-18532-206561677547639 `" && echo ansible-tmp-1726882529.8270774-18532-206561677547639="` echo /root/.ansible/tmp/ansible-tmp-1726882529.8270774-18532-206561677547639 `" ) && sleep 0' 18445 1726882529.83405: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18445 1726882529.83418: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18445 1726882529.83441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18445 1726882529.83466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18445 1726882529.83507: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 18445 1726882529.83518: stderr chunk (state=3): >>>debug2: match not found <<< 18445 1726882529.83530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18445 1726882529.83559: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18445 1726882529.83574: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 18445 1726882529.83584: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18445 1726882529.83594: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18445 1726882529.83606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18445 1726882529.83620: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18445 1726882529.83630: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 18445 1726882529.83640: stderr chunk (state=3): >>>debug2: match found <<< 18445 1726882529.83652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18445 1726882529.83742: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 18445 1726882529.83772: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18445 1726882529.83795: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18445 1726882529.83922: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18445 1726882529.85808: stdout chunk (state=3): >>>ansible-tmp-1726882529.8270774-18532-206561677547639=/root/.ansible/tmp/ansible-tmp-1726882529.8270774-18532-206561677547639 <<< 18445 1726882529.85922: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18445 1726882529.86012: stderr chunk (state=3): >>><<< 18445 1726882529.86029: stdout chunk (state=3): >>><<< 18445 1726882529.86068: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882529.8270774-18532-206561677547639=/root/.ansible/tmp/ansible-tmp-1726882529.8270774-18532-206561677547639 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18445 1726882529.86268: variable 'ansible_module_compression' from source: unknown 18445 1726882529.86271: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18445x1hycoyh/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18445 1726882529.86273: variable 'ansible_facts' from source: unknown 18445 1726882529.86367: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882529.8270774-18532-206561677547639/AnsiballZ_setup.py 18445 1726882529.86538: Sending initial data 18445 1726882529.86541: Sent initial data (154 bytes) 18445 1726882529.87539: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18445 1726882529.87553: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18445 1726882529.87573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18445 1726882529.87602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18445 
1726882529.87644: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 18445 1726882529.87659: stderr chunk (state=3): >>>debug2: match not found <<< 18445 1726882529.87676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18445 1726882529.87693: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18445 1726882529.87716: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 18445 1726882529.87728: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18445 1726882529.87739: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18445 1726882529.87752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18445 1726882529.87773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18445 1726882529.87785: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 18445 1726882529.87795: stderr chunk (state=3): >>>debug2: match found <<< 18445 1726882529.87808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18445 1726882529.87898: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 18445 1726882529.87922: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18445 1726882529.87947: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18445 1726882529.88078: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18445 1726882529.89837: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 18445 1726882529.89926: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 18445 1726882529.90027: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18445x1hycoyh/tmpm5gtduvb /root/.ansible/tmp/ansible-tmp-1726882529.8270774-18532-206561677547639/AnsiballZ_setup.py <<< 18445 1726882529.90112: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 18445 1726882529.92887: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18445 1726882529.93071: stderr chunk (state=3): >>><<< 18445 1726882529.93075: stdout chunk (state=3): >>><<< 18445 1726882529.93077: done transferring module to remote 18445 1726882529.93079: _low_level_execute_command(): starting 18445 1726882529.93081: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882529.8270774-18532-206561677547639/ /root/.ansible/tmp/ansible-tmp-1726882529.8270774-18532-206561677547639/AnsiballZ_setup.py && sleep 0' 18445 1726882529.93718: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 
Jun 2024 <<< 18445 1726882529.93748: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18445 1726882529.93772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18445 1726882529.93803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18445 1726882529.93844: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 18445 1726882529.93883: stderr chunk (state=3): >>>debug2: match not found <<< 18445 1726882529.93901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18445 1726882529.93919: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18445 1726882529.93934: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 18445 1726882529.93947: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18445 1726882529.93960: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18445 1726882529.93976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18445 1726882529.94000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18445 1726882529.94012: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 18445 1726882529.94022: stderr chunk (state=3): >>>debug2: match found <<< 18445 1726882529.94048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18445 1726882529.94133: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 18445 1726882529.94154: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18445 1726882529.94172: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18445 1726882529.94296: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18445 1726882529.96944: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18445 1726882529.97168: stderr chunk (state=3): >>><<< 18445 1726882529.97172: stdout chunk (state=3): >>><<< 18445 1726882529.97175: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 18445 1726882529.97177: 
_low_level_execute_command(): starting 18445 1726882529.97180: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882529.8270774-18532-206561677547639/AnsiballZ_setup.py && sleep 0' 18445 1726882529.97703: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18445 1726882529.97706: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18445 1726882529.97709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18445 1726882529.97711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18445 1726882529.97720: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 18445 1726882529.97728: stderr chunk (state=3): >>>debug2: match not found <<< 18445 1726882529.97737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18445 1726882529.97831: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18445 1726882529.97834: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 18445 1726882529.97836: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18445 1726882529.97838: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18445 1726882529.97840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18445 1726882529.97842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18445 1726882529.97844: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 18445 1726882529.97846: stderr chunk (state=3): >>>debug2: match found <<< 18445 1726882529.97848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18445 1726882529.97889: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 18445 1726882529.97908: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18445 1726882529.97919: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18445 1726882529.98054: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18445 1726882530.00375: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 18445 1726882530.00402: stdout chunk (state=3): >>>import '_thread' # <<< 18445 1726882530.00409: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 18445 1726882530.00489: stdout chunk (state=3): >>>import '_io' # <<< 18445 1726882530.00497: stdout chunk (state=3): >>>import 'marshal' # <<< 18445 1726882530.00540: stdout chunk (state=3): >>>import 'posix' # <<< 18445 1726882530.00578: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 18445 1726882530.00583: stdout chunk (state=3): >>># installing zipimport hook <<< 18445 1726882530.00638: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # <<< 18445 1726882530.00641: stdout chunk (state=3): >>># installed zipimport hook <<< 18445 1726882530.00721: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py <<< 18445 1726882530.00727: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 18445 1726882530.00741: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 18445 1726882530.00770: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' <<< 18445 1726882530.00773: stdout chunk (state=3): >>>import '_codecs' # <<< 18445 1726882530.00802: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe387173dc0> <<< 18445 1726882530.00836: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 18445 1726882530.00865: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3871183a0> <<< 18445 1726882530.00878: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe387173b20> <<< 18445 1726882530.00904: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 18445 1726882530.00929: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe387173ac0> <<< 18445 1726882530.00950: stdout chunk (state=3): >>>import '_signal' # <<< 18445 1726882530.00975: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 18445 1726882530.00992: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe387118490> <<< 18445 1726882530.01188: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe387118940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe387118670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 18445 1726882530.01193: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 18445 1726882530.01211: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 18445 1726882530.01223: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 18445 1726882530.01246: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 18445 1726882530.01272: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3870cf190> <<< 18445 1726882530.01300: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 18445 1726882530.01318: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 18445 1726882530.01427: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3870cf220> <<< 18445 1726882530.01459: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py <<< 18445 1726882530.01492: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 18445 1726882530.01507: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3870f2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3870cf940> <<< 18445 1726882530.01555: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe387130880> <<< 18445 1726882530.01584: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3870c8d90> <<< 18445 1726882530.01643: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 18445 1726882530.01656: stdout chunk (state=3): >>>import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3870f2d90> <<< 18445 1726882530.01731: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe387118970> <<< 18445 1726882530.01765: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
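Note on the trace around this point: the AnsiballZ wrapper was started with PYTHONVERBOSE=1 (visible in the /bin/sh command earlier in this log), which is equivalent to running the interpreter with -v, so the remote CPython 3.9 prints a line for every module it imports and for every cached .pyc it loads. That is what the long run of "import ..." and "# code object from ..." chunks in this section is. A minimal sketch of reproducing the same kind of trace locally, assuming only a python3 binary on PATH rather than the exact interpreter used on the managed node:

    # Sketch: reproduce a verbose import trace like the one captured above.
    # PYTHONVERBOSE=1 is equivalent to `python -v`; the interpreter reports
    # each import (and each cached bytecode file) as it happens.
    import os
    import subprocess

    env = dict(os.environ, PYTHONVERBOSE="1")
    proc = subprocess.run(
        ["python3", "-c", "import json"],  # any small import will do
        env=env,
        capture_output=True,
        text=True,
    )
    # Trace lines such as "import 'json' # <_frozen_importlib_external...>"
    # are written to stderr when invoked this way.
    print(proc.stderr)
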
<<< 18445 1726882530.02314: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 18445 1726882530.02317: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 18445 1726882530.02357: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py <<< 18445 1726882530.02360: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 18445 1726882530.02385: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 18445 1726882530.02406: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 18445 1726882530.02516: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 18445 1726882530.02593: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe387093eb0> <<< 18445 1726882530.02614: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe387096f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 18445 1726882530.02651: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe38708c610> <<< 18445 1726882530.02690: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe387092640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe387093370> <<< 18445 1726882530.02828: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 18445 1726882530.02832: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 18445 1726882530.02843: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 18445 1726882530.02903: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe386d4ce20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386d4c910> <<< 18445 1726882530.02907: stdout chunk (state=3): 
>>>import 'itertools' # <<< 18445 1726882530.02925: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386d4cf10> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 18445 1726882530.02961: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 18445 1726882530.03090: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386d4cfd0> <<< 18445 1726882530.03154: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386d5f0d0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe38706ed90> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe387067670> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe38707a6d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe38709ae20> <<< 18445 1726882530.03181: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 18445 1726882530.03214: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe386d5fcd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe38706e2b0> <<< 18445 1726882530.03253: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 18445 1726882530.03290: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe38707a2e0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3870a09d0> <<< 18445 1726882530.03308: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 18445 1726882530.03325: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 18445 1726882530.03354: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from 
'/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 18445 1726882530.03393: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386d5feb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386d5fdf0> <<< 18445 1726882530.03407: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386d5fd60> <<< 18445 1726882530.03420: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 18445 1726882530.03457: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py <<< 18445 1726882530.03474: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 18445 1726882530.03486: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 18445 1726882530.03525: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 18445 1726882530.03574: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386d323d0> <<< 18445 1726882530.03590: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 18445 1726882530.03616: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386d324c0> <<< 18445 1726882530.03739: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386d66f40> <<< 18445 1726882530.03785: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386d61a90> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386d61490> <<< 18445 1726882530.03815: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 18445 1726882530.03866: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 18445 1726882530.03892: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 18445 1726882530.03911: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386c81220> <<< 18445 1726882530.03932: stdout chunk (state=3): >>>import 'weakref' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7fe386d1d520> <<< 18445 1726882530.03986: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386d61f10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3870a0040> <<< 18445 1726882530.04012: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 18445 1726882530.04049: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 18445 1726882530.04078: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386c92b50> <<< 18445 1726882530.04082: stdout chunk (state=3): >>>import 'errno' # <<< 18445 1726882530.04107: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe386c92e80> <<< 18445 1726882530.04163: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 18445 1726882530.04182: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' <<< 18445 1726882530.04199: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386ca4790> <<< 18445 1726882530.04203: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 18445 1726882530.04223: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 18445 1726882530.04251: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386ca4cd0> <<< 18445 1726882530.04293: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe386c31400> <<< 18445 1726882530.04319: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386c92f70> <<< 18445 1726882530.04332: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 18445 1726882530.04380: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe386c422e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7fe386ca4610> <<< 18445 1726882530.04402: stdout chunk (state=3): >>>import 'pwd' # <<< 18445 1726882530.04414: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe386c423a0> <<< 18445 1726882530.04450: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386d5fa30> <<< 18445 1726882530.04485: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 18445 1726882530.04511: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 18445 1726882530.04524: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 18445 1726882530.04562: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe386c5d700> <<< 18445 1726882530.04598: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 18445 1726882530.04611: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe386c5d9d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386c5d7c0> <<< 18445 1726882530.04635: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe386c5d8b0> <<< 18445 1726882530.04657: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py <<< 18445 1726882530.04675: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 18445 1726882530.05394: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe386c5dd00> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe386c68250> 
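In this trace, entries like "# extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so'" are compiled C extensions handled by ExtensionFileLoader, while plain "import 'hashlib' # <_frozen_importlib_external.SourceFileLoader ...>" entries are pure-Python modules loaded from .py files. A small sketch, with illustrative module names, of how the same loader and origin information can be inspected from inside Python:

    # Sketch: show which loader serves a module and where it comes from,
    # mirroring the SourceFileLoader / ExtensionFileLoader split in the trace.
    import importlib.util

    for name in ("hashlib", "_hashlib", "json"):  # illustrative names
        spec = importlib.util.find_spec(name)
        # spec.origin is the .py file for pure-Python modules and the .so
        # shared object for C extension modules.
        print(name, "->", type(spec.loader).__name__, spec.origin)
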
import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386c5d940> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386c51a90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386d5f610> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386c5daf0> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fe386b7f6d0> <<< 18445 1726882530.05714: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip' # zipimport: zlib available <<< 18445 1726882530.05972: stdout chunk (state=3): >>># zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available <<< 18445 1726882530.07878: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.09468: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py <<< 18445 1726882530.09506: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' <<< 18445 1726882530.09509: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3864bb820> <<< 18445 1726882530.09543: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py<<< 18445 1726882530.09566: stdout chunk (state=3): >>> <<< 18445 1726882530.09569: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 18445 1726882530.09619: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py<<< 18445 1726882530.09637: stdout chunk (state=3): >>> <<< 18445 1726882530.09646: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 18445 1726882530.09675: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py <<< 18445 1726882530.09688: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 18445 1726882530.09751: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so'<<< 18445 1726882530.09775: stdout chunk (state=3): >>> <<< 18445 1726882530.09778: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe386549730> <<< 18445 1726882530.09842: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386549610><<< 18445 1726882530.09847: stdout chunk (state=3): >>> <<< 18445 1726882530.09895: 
stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386549340> <<< 18445 1726882530.09939: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py<<< 18445 1726882530.09969: stdout chunk (state=3): >>> <<< 18445 1726882530.09972: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 18445 1726882530.10028: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386549460> <<< 18445 1726882530.10058: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386549160> <<< 18445 1726882530.10081: stdout chunk (state=3): >>>import 'atexit' # <<< 18445 1726882530.10123: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so'<<< 18445 1726882530.10152: stdout chunk (state=3): >>> # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe3865493a0><<< 18445 1726882530.10155: stdout chunk (state=3): >>> <<< 18445 1726882530.10198: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py<<< 18445 1726882530.10202: stdout chunk (state=3): >>> <<< 18445 1726882530.10244: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 18445 1726882530.10327: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386549790> <<< 18445 1726882530.10369: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py<<< 18445 1726882530.10372: stdout chunk (state=3): >>> <<< 18445 1726882530.10418: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc'<<< 18445 1726882530.10431: stdout chunk (state=3): >>> <<< 18445 1726882530.10444: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 18445 1726882530.10487: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 18445 1726882530.10545: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py <<< 18445 1726882530.10548: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 18445 1726882530.10688: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386539820> <<< 18445 1726882530.10754: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so'<<< 18445 1726882530.10784: stdout chunk (state=3): >>> # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe386539490> <<< 18445 1726882530.10828: stdout chunk (state=3): >>># extension module 'select' loaded from 
'/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' <<< 18445 1726882530.10865: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe386539640><<< 18445 1726882530.10868: stdout chunk (state=3): >>> <<< 18445 1726882530.10912: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 18445 1726882530.10924: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 18445 1726882530.10984: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe38643f520> <<< 18445 1726882530.11019: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386544d60> <<< 18445 1726882530.11314: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3865494f0><<< 18445 1726882530.11324: stdout chunk (state=3): >>> <<< 18445 1726882530.11359: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py <<< 18445 1726882530.11375: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 18445 1726882530.11406: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3865441c0> <<< 18445 1726882530.11462: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py<<< 18445 1726882530.11470: stdout chunk (state=3): >>> <<< 18445 1726882530.11481: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 18445 1726882530.11538: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py <<< 18445 1726882530.11541: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 18445 1726882530.11575: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 18445 1726882530.11600: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 18445 1726882530.11649: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py<<< 18445 1726882530.11675: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' <<< 18445 1726882530.11691: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386548b20> <<< 18445 1726882530.11828: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386518160> <<< 18445 1726882530.11831: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386518760> <<< 18445 1726882530.11842: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386445d30> <<< 18445 1726882530.11906: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' 
<<< 18445 1726882530.11921: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe386518670> <<< 18445 1726882530.11966: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py<<< 18445 1726882530.11995: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe38659ad00> <<< 18445 1726882530.12035: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 18445 1726882530.12072: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 18445 1726882530.12111: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 18445 1726882530.12145: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 18445 1726882530.12229: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe38649ca00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3865a4e80> <<< 18445 1726882530.12252: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 18445 1726882530.12329: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' <<< 18445 1726882530.12334: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe3864aa0a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3865a4eb0> <<< 18445 1726882530.12349: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 18445 1726882530.12374: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 18445 1726882530.12409: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' <<< 18445 1726882530.12412: stdout chunk (state=3): >>>import '_string' # <<< 18445 1726882530.12467: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3865ac250> <<< 18445 1726882530.12611: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3864aa0d0> <<< 18445 1726882530.12728: stdout chunk (state=3): >>># extension module 'systemd._journal' 
loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' <<< 18445 1726882530.12742: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe3865aca60> <<< 18445 1726882530.12808: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' <<< 18445 1726882530.12824: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe38656eb80> <<< 18445 1726882530.12905: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' <<< 18445 1726882530.12924: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe3865a4cd0> <<< 18445 1726882530.12946: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe38659aee0> <<< 18445 1726882530.12990: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py <<< 18445 1726882530.13011: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 18445 1726882530.13032: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 18445 1726882530.13082: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc'<<< 18445 1726882530.13086: stdout chunk (state=3): >>> <<< 18445 1726882530.13164: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' <<< 18445 1726882530.13181: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe3864a60d0> <<< 18445 1726882530.13500: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so'<<< 18445 1726882530.13544: stdout chunk (state=3): >>> # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe38649d310> <<< 18445 1726882530.13560: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3864a6cd0> <<< 18445 1726882530.13602: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' <<< 18445 1726882530.13637: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from 
'/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe3864a6670> <<< 18445 1726882530.13654: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3864a7100> <<< 18445 1726882530.13694: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.13718: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.13731: stdout chunk (state=3): >>>import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py <<< 18445 1726882530.13770: stdout chunk (state=3): >>># zipimport: zlib available<<< 18445 1726882530.13773: stdout chunk (state=3): >>> <<< 18445 1726882530.13890: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.14026: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.14050: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.14077: stdout chunk (state=3): >>>import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py <<< 18445 1726882530.14096: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.14130: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.14143: stdout chunk (state=3): >>>import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py <<< 18445 1726882530.14169: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.14334: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.14489: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.15269: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.16049: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py<<< 18445 1726882530.16053: stdout chunk (state=3): >>> <<< 18445 1726882530.16095: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 18445 1726882530.16113: stdout chunk (state=3): >>> import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 18445 1726882530.16145: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py<<< 18445 1726882530.16148: stdout chunk (state=3): >>> <<< 18445 1726882530.16181: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 18445 1726882530.16292: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe3864e5910><<< 18445 1726882530.16295: stdout chunk (state=3): >>> <<< 18445 1726882530.16412: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py <<< 18445 1726882530.16470: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 18445 1726882530.16485: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3864ea9a0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386068640> <<< 18445 1726882530.16871: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available <<< 18445 1726882530.17070: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py <<< 18445 1726882530.17085: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 18445 1726882530.17132: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3865207f0><<< 18445 1726882530.17136: stdout chunk (state=3): >>> <<< 18445 1726882530.17150: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.17802: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.18428: stdout chunk (state=3): >>># zipimport: zlib available<<< 18445 1726882530.18431: stdout chunk (state=3): >>> <<< 18445 1726882530.18530: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.18649: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/common/collections.py<<< 18445 1726882530.18652: stdout chunk (state=3): >>> <<< 18445 1726882530.18666: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.18719: stdout chunk (state=3): >>># zipimport: zlib available<<< 18445 1726882530.18723: stdout chunk (state=3): >>> <<< 18445 1726882530.18782: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py <<< 18445 1726882530.18796: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.18890: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.19007: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/errors.py <<< 18445 1726882530.19047: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.19071: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.19098: stdout chunk (state=3): >>>import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py <<< 18445 1726882530.19114: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.19169: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.19225: stdout chunk (state=3): >>>import 
ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 18445 1726882530.19249: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.19562: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.19890: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py<<< 18445 1726882530.19894: stdout chunk (state=3): >>> <<< 18445 1726882530.19947: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 18445 1726882530.19974: stdout chunk (state=3): >>>import '_ast' # <<< 18445 1726882530.20079: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386566460> <<< 18445 1726882530.20105: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.20204: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.20311: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py <<< 18445 1726882530.20337: stdout chunk (state=3): >>>import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py <<< 18445 1726882530.20378: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py<<< 18445 1726882530.20397: stdout chunk (state=3): >>> <<< 18445 1726882530.20412: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.20466: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.20552: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available <<< 18445 1726882530.20940: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 18445 1726882530.21022: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc'<<< 18445 1726882530.21025: stdout chunk (state=3): >>> <<< 18445 1726882530.21104: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' <<< 18445 1726882530.21118: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe3864da0d0> <<< 18445 1726882530.21282: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3864ea1f0> <<< 18445 1726882530.21341: stdout chunk (state=3): >>>import 
ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/common/file.py <<< 18445 1726882530.21369: stdout chunk (state=3): >>>import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/common/process.py<<< 18445 1726882530.21392: stdout chunk (state=3): >>> <<< 18445 1726882530.21395: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.21467: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.21561: stdout chunk (state=3): >>># zipimport: zlib available<<< 18445 1726882530.21566: stdout chunk (state=3): >>> <<< 18445 1726882530.21607: stdout chunk (state=3): >>># zipimport: zlib available<<< 18445 1726882530.21610: stdout chunk (state=3): >>> <<< 18445 1726882530.21668: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 18445 1726882530.21707: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc'<<< 18445 1726882530.21710: stdout chunk (state=3): >>> <<< 18445 1726882530.21741: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 18445 1726882530.21800: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 18445 1726882530.21843: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py<<< 18445 1726882530.21846: stdout chunk (state=3): >>> <<< 18445 1726882530.21887: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc'<<< 18445 1726882530.21891: stdout chunk (state=3): >>> <<< 18445 1726882530.22025: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3864ecbb0> <<< 18445 1726882530.22105: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3865b5070><<< 18445 1726882530.22108: stdout chunk (state=3): >>> <<< 18445 1726882530.22235: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3864dd2e0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py<<< 18445 1726882530.22239: stdout chunk (state=3): >>> <<< 18445 1726882530.22251: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.22301: stdout chunk (state=3): >>># zipimport: zlib available<<< 18445 1726882530.22320: stdout chunk (state=3): >>> <<< 18445 1726882530.22347: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 18445 1726882530.22467: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/basic.py <<< 18445 1726882530.22492: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 
1726882530.22528: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.22551: stdout chunk (state=3): >>>import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/modules/__init__.py <<< 18445 1726882530.22573: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.22670: stdout chunk (state=3): >>># zipimport: zlib available<<< 18445 1726882530.22673: stdout chunk (state=3): >>> <<< 18445 1726882530.22750: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.22786: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.22829: stdout chunk (state=3): >>># zipimport: zlib available<<< 18445 1726882530.22832: stdout chunk (state=3): >>> <<< 18445 1726882530.22892: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.22957: stdout chunk (state=3): >>># zipimport: zlib available<<< 18445 1726882530.22960: stdout chunk (state=3): >>> <<< 18445 1726882530.23015: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.23080: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py<<< 18445 1726882530.23087: stdout chunk (state=3): >>> <<< 18445 1726882530.23098: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.23203: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.23319: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.23359: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.23411: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py <<< 18445 1726882530.23447: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.23683: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.23920: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.23988: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.24069: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 18445 1726882530.24107: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py <<< 18445 1726882530.24135: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' <<< 18445 1726882530.24202: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 18445 1726882530.24249: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe38601b400> <<< 18445 1726882530.24287: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py <<< 18445 1726882530.24317: stdout chunk (state=3): >>># code object 
from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 18445 1726882530.24354: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py<<< 18445 1726882530.24357: stdout chunk (state=3): >>> <<< 18445 1726882530.24412: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc'<<< 18445 1726882530.24415: stdout chunk (state=3): >>> <<< 18445 1726882530.24443: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py <<< 18445 1726882530.24486: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc'<<< 18445 1726882530.24494: stdout chunk (state=3): >>> <<< 18445 1726882530.24506: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe38607a9a0> <<< 18445 1726882530.24573: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so'<<< 18445 1726882530.24604: stdout chunk (state=3): >>> # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe38607adf0><<< 18445 1726882530.24607: stdout chunk (state=3): >>> <<< 18445 1726882530.24713: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386078490><<< 18445 1726882530.24722: stdout chunk (state=3): >>> <<< 18445 1726882530.24742: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe385ef3040> <<< 18445 1726882530.24802: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe385de33a0><<< 18445 1726882530.24807: stdout chunk (state=3): >>> <<< 18445 1726882530.24822: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe385de35e0> <<< 18445 1726882530.24847: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 18445 1726882530.24890: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' <<< 18445 1726882530.24933: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py<<< 18445 1726882530.24952: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 18445 1726882530.25008: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so'<<< 18445 1726882530.25044: stdout chunk (state=3): >>> # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe3864d96d0> <<< 18445 1726882530.25065: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386088730> <<< 18445 1726882530.25099: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches 
/usr/lib64/python3.9/multiprocessing/util.py <<< 18445 1726882530.25131: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc'<<< 18445 1726882530.25135: stdout chunk (state=3): >>> <<< 18445 1726882530.25177: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3864d95e0><<< 18445 1726882530.25179: stdout chunk (state=3): >>> <<< 18445 1726882530.25214: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py<<< 18445 1726882530.25217: stdout chunk (state=3): >>> <<< 18445 1726882530.25252: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 18445 1726882530.25333: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' <<< 18445 1726882530.25337: stdout chunk (state=3): >>># extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe3860343a0> <<< 18445 1726882530.25379: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe385e429a0><<< 18445 1726882530.25386: stdout chunk (state=3): >>> <<< 18445 1726882530.25421: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe385de34f0><<< 18445 1726882530.25434: stdout chunk (state=3): >>> <<< 18445 1726882530.25437: stdout chunk (state=3): >>>import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py<<< 18445 1726882530.25442: stdout chunk (state=3): >>> <<< 18445 1726882530.25491: stdout chunk (state=3): >>>import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py <<< 18445 1726882530.25539: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.25568: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.25589: stdout chunk (state=3): >>>import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py <<< 18445 1726882530.25593: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.25662: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.25728: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py <<< 18445 1726882530.25805: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.25828: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.25937: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available <<< 18445 1726882530.26036: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 
1726882530.26053: stdout chunk (state=3): >>>import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available <<< 18445 1726882530.26140: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available <<< 18445 1726882530.26847: stdout chunk (state=3): >>># zipimport: zlib available<<< 18445 1726882530.26865: stdout chunk (state=3): >>> import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available <<< 18445 1726882530.27493: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.28152: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py <<< 18445 1726882530.28189: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.28270: stdout chunk (state=3): >>># zipimport: zlib available<<< 18445 1726882530.28273: stdout chunk (state=3): >>> <<< 18445 1726882530.28343: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.28413: stdout chunk (state=3): >>># zipimport: zlib available<<< 18445 1726882530.28416: stdout chunk (state=3): >>> <<< 18445 1726882530.28461: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py <<< 18445 1726882530.28480: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py <<< 18445 1726882530.28507: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.28545: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.28603: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py <<< 18445 1726882530.28605: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.28675: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.28730: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py <<< 18445 1726882530.28750: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.28795: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.28837: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py <<< 18445 1726882530.28850: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.28897: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.28942: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py <<< 18445 1726882530.28955: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.29050: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.29161: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 18445 1726882530.29215: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe385de39d0> <<< 18445 1726882530.29233: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 18445 1726882530.29259: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 18445 1726882530.29526: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe385d66f40> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py <<< 18445 1726882530.29549: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.29614: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.29691: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py <<< 18445 1726882530.29706: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.29806: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.29922: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py <<< 18445 1726882530.29925: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.29997: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.30104: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available <<< 18445 1726882530.30172: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.30205: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 18445 1726882530.30229: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 18445 1726882530.30381: stdout chunk (state=3): >>># extension module '_ssl' 
loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe385d483a0> <<< 18445 1726882530.30640: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe385da8100> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py <<< 18445 1726882530.30644: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.30690: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.30736: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available <<< 18445 1726882530.30816: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.30897: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.30972: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.31123: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available <<< 18445 1726882530.31153: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.31192: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available <<< 18445 1726882530.31239: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.31288: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 18445 1726882530.31333: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe385cee6a0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe385ceea90> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py <<< 18445 1726882530.31367: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18445 1726882530.31876: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip 
/tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available <<< 18445 1726882530.32016: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.32169: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.32192: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.32225: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py <<< 18445 1726882530.32672: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 18445 1726882530.32753: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py <<< 18445 1726882530.32791: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py <<< 18445 1726882530.32812: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.32979: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.33168: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available <<< 18445 1726882530.33210: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.33266: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.33994: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.34442: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py <<< 18445 1726882530.34446: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py <<< 18445 1726882530.34448: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.34535: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.34649: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py <<< 18445 1726882530.34653: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.34718: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.34799: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip 
/tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available <<< 18445 1726882530.34931: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.35091: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py <<< 18445 1726882530.35094: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.35096: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.35098: stdout chunk (state=3): >>>import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py <<< 18445 1726882530.35100: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.35131: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.35176: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py <<< 18445 1726882530.35179: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.35259: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.35342: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.35509: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.35677: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 18445 1726882530.35690: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.35714: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.35753: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py <<< 18445 1726882530.35760: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.35781: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.35806: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py <<< 18445 1726882530.35809: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.35867: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.35928: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available <<< 18445 1726882530.35956: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.35984: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py <<< 18445 1726882530.35987: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 18445 1726882530.36034: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.36093: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py <<< 18445 1726882530.36096: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.36137: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.36192: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available <<< 18445 1726882530.36409: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.36630: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py <<< 18445 1726882530.36634: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.36677: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.36728: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available <<< 18445 1726882530.36766: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.36797: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py <<< 18445 1726882530.36800: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.36823: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.36860: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available <<< 18445 1726882530.36895: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.36925: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py <<< 18445 1726882530.36928: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.36997: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.37072: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available <<< 18445 1726882530.37102: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py <<< 18445 1726882530.37106: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.37146: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.37208: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip 
/tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py <<< 18445 1726882530.37211: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.37217: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.37232: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.37271: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.37318: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.37370: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.37443: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py <<< 18445 1726882530.37446: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available <<< 18445 1726882530.37490: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.37540: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py <<< 18445 1726882530.37543: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.37699: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.37858: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available <<< 18445 1726882530.37901: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.37951: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py <<< 18445 1726882530.37955: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.37992: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.38030: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py <<< 18445 1726882530.38036: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.38113: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882530.38181: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py <<< 18445 1726882530.38579: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip 
/tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available <<< 18445 1726882530.38955: stdout chunk (state=3): >>>import 'gc' # <<< 18445 1726882530.40152: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 18445 1726882530.40203: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py <<< 18445 1726882530.40206: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 18445 1726882530.40257: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe385cd05e0> <<< 18445 1726882530.40260: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe385cd5790> <<< 18445 1726882530.40348: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe385cd5040> <<< 18445 1726882530.41671: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "35", "second": "30", "epoch": "1726882530", "epoch_int": "1726882530", "date": "2024-09-20", "time": "21:35:30", "iso8601_micro": "2024-09-21T01:35:30.386114Z", "iso8601": 
"2024-09-21T01:35:30Z", "iso8601_basic": "20240920T213530386114", "iso8601_basic_short": "20240920T213530", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_pu<<< 18445 1726882530.41676: stdout chunk (state=3): >>>blic_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_pkg_mgr": "dnf", "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu 
Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_service_mgr": "systemd", "ansible_lsb": {}, "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18445 1726882530.42384: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins <<< 18445 1726882530.42387: stdout chunk (state=3): >>># cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site <<< 18445 1726882530.42575: stdout chunk (state=3): >>># cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] 
removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] 
removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils <<< 18445 1726882530.42590: stdout chunk (state=3): >>># destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns <<< 18445 1726882530.42673: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # 
cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # 
destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system<<< 18445 1726882530.42789: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps <<< 18445 1726882530.42792: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] 
removing encodings.idna <<< 18445 1726882530.43085: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 18445 1726882530.43125: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 18445 1726882530.43156: stdout chunk (state=3): >>># destroy zipimport <<< 18445 1726882530.43173: stdout chunk (state=3): >>># destroy _compression <<< 18445 1726882530.43258: stdout chunk (state=3): >>># destroy binascii # destroy importlib # destroy bz2 # destroy lzma <<< 18445 1726882530.43331: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 18445 1726882530.43349: stdout chunk (state=3): >>># destroy _json # destroy encodings <<< 18445 1726882530.43365: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 18445 1726882530.43417: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy logging # destroy argparse <<< 18445 1726882530.43484: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector <<< 18445 1726882530.43499: stdout chunk (state=3): >>># destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 18445 1726882530.43556: stdout chunk (state=3): >>># destroy queue # destroy multiprocessing.process <<< 18445 1726882530.43559: stdout chunk (state=3): >>># destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction <<< 18445 1726882530.43585: stdout chunk (state=3): >>># destroy shlex # destroy datetime <<< 18445 1726882530.43609: stdout chunk (state=3): >>># destroy base64 <<< 18445 1726882530.43639: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass <<< 18445 1726882530.43677: stdout chunk (state=3): >>># destroy json <<< 18445 1726882530.43691: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 18445 1726882530.43770: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios <<< 18445 1726882530.43844: stdout chunk (state=3): >>># cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal <<< 18445 1726882530.43897: stdout chunk (state=3): >>># cleanup[3] wiping fcntl # cleanup[3] wiping atexit # 
cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch <<< 18445 1726882530.43958: stdout chunk (state=3): >>># cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools <<< 18445 1726882530.44062: stdout chunk (state=3): >>># cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os <<< 18445 1726882530.44088: stdout chunk (state=3): >>># cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 18445 1726882530.44103: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 18445 1726882530.44179: stdout chunk (state=3): >>># destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket <<< 18445 1726882530.44182: stdout chunk (state=3): >>># destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 18445 1726882530.44403: stdout chunk (state=3): >>># destroy platform # destroy _uuid <<< 18445 1726882530.44436: stdout chunk (state=3): >>># destroy _sre # destroy sre_parse # destroy tokenize <<< 18445 1726882530.44470: stdout chunk (state=3): >>># destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess <<< 18445 1726882530.44511: stdout chunk (state=3): >>># destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy 
ansible.module_utils.six.moves # destroy _operator <<< 18445 1726882530.44544: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 18445 1726882530.44602: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 18445 1726882530.45106: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 18445 1726882530.45115: stdout chunk (state=3): >>><<< 18445 1726882530.45127: stderr chunk (state=3): >>><<< 18445 1726882530.45295: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe387173dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3871183a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe387173b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe387173ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe387118490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe387118940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe387118670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3870cf190> # 
/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3870cf220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3870f2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3870cf940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe387130880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3870c8d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3870f2d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe387118970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe387093eb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe387096f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe38708c610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe387092640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe387093370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe386d4ce20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386d4c910> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386d4cf10> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386d4cfd0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386d5f0d0> import '_collections' # import 
'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe38706ed90> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe387067670> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe38707a6d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe38709ae20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe386d5fcd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe38706e2b0> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe38707a2e0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3870a09d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386d5feb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386d5fdf0> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386d5fd60> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386d323d0> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches 
/usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386d324c0> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386d66f40> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386d61a90> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386d61490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386c81220> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386d1d520> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386d61f10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3870a0040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386c92b50> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe386c92e80> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386ca4790> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386ca4cd0> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe386c31400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386c92f70> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # 
extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe386c422e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386ca4610> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe386c423a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386d5fa30> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe386c5d700> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe386c5d9d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386c5d7c0> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe386c5d8b0> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe386c5dd00> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe386c68250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386c5d940> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386c51a90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386d5f610> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 
'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386c5daf0> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fe386b7f6d0> # zipimport: found 103 names in '/tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3864bb820> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe386549730> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386549610> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386549340> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386549460> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386549160> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe3865493a0> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386549790> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code 
object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386539820> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe386539490> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe386539640> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe38643f520> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386544d60> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3865494f0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3865441c0> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386548b20> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386518160> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386518760> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386445d30> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe386518670> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe38659ad00> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe38649ca00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3865a4e80> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe3864aa0a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3865a4eb0> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3865ac250> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3864aa0d0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe3865aca60> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe38656eb80> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe3865a4cd0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe38659aee0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from 
'/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe3864a60d0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe38649d310> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3864a6cd0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe3864a6670> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3864a7100> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe3864e5910> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3864ea9a0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386068640> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/_text.py # 
zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3865207f0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386566460> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe3864da0d0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3864ea1f0> import 
ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3864ecbb0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3865b5070> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3864dd2e0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from 
'/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe38601b400> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe38607a9a0> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe38607adf0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386078490> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe385ef3040> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe385de33a0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe385de35e0> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe3864d96d0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe386088730> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe3864d95e0> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe3860343a0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe385e429a0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe385de34f0> import 
ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe385de39d0> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe385d66f40> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe385d483a0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe385da8100> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # 
zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe385cee6a0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe385ceea90> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip 
/tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # 
zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_wrcfjj8b/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available import 'gc' # # 
/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe385cd05e0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe385cd5790> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe385cd5040> {"ansible_facts": {"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "35", "second": "30", "epoch": "1726882530", "epoch_int": "1726882530", "date": "2024-09-20", "time": "21:35:30", "iso8601_micro": "2024-09-21T01:35:30.386114Z", "iso8601": "2024-09-21T01:35:30Z", "iso8601_basic": "20240920T213530386114", "iso8601_basic_short": "20240920T213530", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_pkg_mgr": "dnf", "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_service_mgr": "systemd", "ansible_lsb": {}, "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear 
sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing 
ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing 
ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing 
ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # 
destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy 
getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy 
_blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
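Editor's note: at this point the setup module has returned its facts and the multiplexed SSH session is released. The import/cleanup chatter interleaved with the result above is the CPython verbose-import trace, consistent with PYTHONVERBOSE=1 being set in the remote environment (visible in ansible_env in the facts); Ansible locates the JSON payload anyway and flags the trailing trace in the warning that follows. As a minimal, hypothetical sketch (not part of this run, task name invented) of how a later play task could consume the facts gathered here:

    - name: Show distribution facts gathered by the setup module
      ansible.builtin.debug:
        msg: "{{ ansible_distribution }} {{ ansible_distribution_version }} uses {{ ansible_pkg_mgr }}"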
[WARNING]: Module invocation had junk after the JSON data: # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy 
zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # 
cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing 
ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy 
ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex # 
destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy 
systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks 18445 1726882530.47469: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882529.8270774-18532-206561677547639/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18445 1726882530.47473: _low_level_execute_command(): starting 18445 1726882530.47475: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882529.8270774-18532-206561677547639/ > /dev/null 2>&1 && sleep 0' 18445 1726882530.48597: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18445 1726882530.48851: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18445 1726882530.48854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18445 1726882530.48857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18445 1726882530.48860: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 18445 1726882530.49038: stderr chunk (state=3): >>>debug2: match not found <<< 18445 1726882530.49041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18445 1726882530.49044: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18445 1726882530.49046: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 18445 1726882530.49048: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18445 1726882530.49050: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18445 1726882530.49052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18445 1726882530.49054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18445 1726882530.49056: stderr chunk (state=3): >>>debug2: checking match for 'final all' 
host 10.31.44.90 originally 10.31.44.90 <<< 18445 1726882530.49058: stderr chunk (state=3): >>>debug2: match found <<< 18445 1726882530.49060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18445 1726882530.49134: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 18445 1726882530.49148: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18445 1726882530.49153: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18445 1726882530.49379: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18445 1726882530.51175: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18445 1726882530.51182: stdout chunk (state=3): >>><<< 18445 1726882530.51206: stderr chunk (state=3): >>><<< 18445 1726882530.51211: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18445 1726882530.51219: handler run complete 18445 1726882530.51288: variable 'ansible_facts' from source: unknown 18445 1726882530.51341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882530.51468: variable 'ansible_facts' from source: unknown 18445 1726882530.51517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882530.51578: attempt loop complete, returning result 18445 1726882530.51581: _execute() done 18445 1726882530.51583: dumping result to json 18445 1726882530.51596: done dumping result, returning 18445 1726882530.51605: done running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test [0e448fcc-3ce9-f6eb-935c-00000000008d] 18445 1726882530.51610: sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000008d 18445 1726882530.51774: done sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000008d 18445 1726882530.51777: WORKER PROCESS EXITING ok: [managed_node1] 18445 1726882530.51908: no more pending results, returning what we have 18445 1726882530.51911: results queue empty 18445 1726882530.51912: checking for any_errors_fatal 18445 1726882530.51913: done checking for any_errors_fatal 18445 1726882530.51914: checking for max_fail_percentage 18445 1726882530.51916: done checking for max_fail_percentage 18445 1726882530.51916: 
checking to see if all hosts have failed and the running result is not ok 18445 1726882530.51917: done checking to see if all hosts have failed 18445 1726882530.51918: getting the remaining hosts for this loop 18445 1726882530.51919: done getting the remaining hosts for this loop 18445 1726882530.51922: getting the next task for host managed_node1 18445 1726882530.51930: done getting next task for host managed_node1 18445 1726882530.51932: ^ task is: TASK: Check if system is ostree 18445 1726882530.51935: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882530.51938: getting variables 18445 1726882530.51939: in VariableManager get_vars() 18445 1726882530.51961: Calling all_inventory to load vars for managed_node1 18445 1726882530.51965: Calling groups_inventory to load vars for managed_node1 18445 1726882530.51969: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882530.51979: Calling all_plugins_play to load vars for managed_node1 18445 1726882530.51982: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882530.51984: Calling groups_plugins_play to load vars for managed_node1 18445 1726882530.52122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882530.52313: done with get_vars() 18445 1726882530.52324: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 21:35:30 -0400 (0:00:00.768) 0:00:02.461 ****** 18445 1726882530.52438: entering _queue_task() for managed_node1/stat 18445 1726882530.52705: worker is 1 (out of 1 available) 18445 1726882530.52723: exiting _queue_task() for managed_node1/stat 18445 1726882530.52735: done queuing things up, now waiting for results queue to drain 18445 1726882530.52736: waiting for pending results... 
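The events that follow show the same remote-execution pattern already used for the setup module above: ansible-core creates a per-task temp directory on the managed node, transfers the AnsiballZ-wrapped stat module over the multiplexed SSH connection, marks it executable, runs it with the remote Python interpreter (here under PYTHONVERBOSE=1, which is what produces the import and cleanup traces in this log), and finally removes the temp directory. The sketch below is a minimal standalone illustration of that push-and-execute pattern, not ansible-core's actual implementation; the host, user, paths, and the helper name run_module_remotely are assumptions for illustration only.

    import os
    import random
    import subprocess
    import time

    def run_module_remotely(host, local_payload, user="root"):
        # Hypothetical helper: illustrates the pattern visible in this log,
        # not ansible-core's internal API. The temp path assumes the root
        # user, as in the log above.
        tmp = "/root/.ansible/tmp/ansible-tmp-%s-%s-%s" % (
            time.time(), os.getpid(), random.randint(0, 2**32))
        ssh = ["ssh",
               # Reuse a persistent master connection, matching the repeated
               # "auto-mux: Trying existing master" lines in the log.
               "-o", "ControlMaster=auto", "-o", "ControlPersist=60s",
               "%s@%s" % (user, host)]
        # 1. Create the per-invocation temp directory with a restrictive umask.
        subprocess.run(ssh + ["umask 77 && mkdir -p %s" % tmp], check=True)
        # 2. Copy the module payload (the log does this with sftp over the mux socket).
        subprocess.run(["scp", local_payload,
                        "%s@%s:%s/AnsiballZ_stat.py" % (user, host, tmp)], check=True)
        # 3. Make the directory and payload executable, then run the module verbosely.
        subprocess.run(ssh + ["chmod u+x %s %s/AnsiballZ_stat.py" % (tmp, tmp)],
                       check=True)
        result = subprocess.run(
            ssh + ["PYTHONVERBOSE=1 python3 %s/AnsiballZ_stat.py" % tmp],
            capture_output=True, text=True, check=True)
        # 4. Clean up, mirroring the earlier 'rm -f -r ... && sleep 0' event.
        subprocess.run(ssh + ["rm -f -r %s" % tmp], check=True)
        # The module prints its JSON result on stdout.
        return result.stdout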
18445 1726882530.53008: running TaskExecutor() for managed_node1/TASK: Check if system is ostree 18445 1726882530.54473: in run() - task 0e448fcc-3ce9-f6eb-935c-00000000008f 18445 1726882530.54549: variable 'ansible_search_path' from source: unknown 18445 1726882530.54557: variable 'ansible_search_path' from source: unknown 18445 1726882530.54599: calling self._execute() 18445 1726882530.54716: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882530.54873: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882530.54891: variable 'omit' from source: magic vars 18445 1726882530.55790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18445 1726882530.56418: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18445 1726882530.56468: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18445 1726882530.56521: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18445 1726882530.56634: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18445 1726882530.56843: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18445 1726882530.56876: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18445 1726882530.56906: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882530.57059: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18445 1726882530.57301: Evaluated conditional (not __network_is_ostree is defined): True 18445 1726882530.57311: variable 'omit' from source: magic vars 18445 1726882530.57346: variable 'omit' from source: magic vars 18445 1726882530.57401: variable 'omit' from source: magic vars 18445 1726882530.57498: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18445 1726882530.57528: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18445 1726882530.57597: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18445 1726882530.57702: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18445 1726882530.57716: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18445 1726882530.57746: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18445 1726882530.57754: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882530.57761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882530.57973: Set connection var ansible_shell_type to sh 18445 1726882530.57986: Set connection var ansible_module_compression to ZIP_DEFLATED 18445 1726882530.57997: 
Set connection var ansible_connection to ssh 18445 1726882530.58024: Set connection var ansible_pipelining to False 18445 1726882530.58128: Set connection var ansible_shell_executable to /bin/sh 18445 1726882530.58139: Set connection var ansible_timeout to 10 18445 1726882530.58167: variable 'ansible_shell_executable' from source: unknown 18445 1726882530.58177: variable 'ansible_connection' from source: unknown 18445 1726882530.58184: variable 'ansible_module_compression' from source: unknown 18445 1726882530.58191: variable 'ansible_shell_type' from source: unknown 18445 1726882530.58197: variable 'ansible_shell_executable' from source: unknown 18445 1726882530.58203: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882530.58210: variable 'ansible_pipelining' from source: unknown 18445 1726882530.58218: variable 'ansible_timeout' from source: unknown 18445 1726882530.58232: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882530.58495: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 18445 1726882530.58666: variable 'omit' from source: magic vars 18445 1726882530.58682: starting attempt loop 18445 1726882530.58689: running the handler 18445 1726882530.58707: _low_level_execute_command(): starting 18445 1726882530.58718: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18445 1726882530.60534: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18445 1726882530.60550: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18445 1726882530.60623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18445 1726882530.60642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18445 1726882530.60689: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 18445 1726882530.60703: stderr chunk (state=3): >>>debug2: match not found <<< 18445 1726882530.60721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18445 1726882530.60747: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18445 1726882530.60844: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 18445 1726882530.60858: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18445 1726882530.60875: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18445 1726882530.60891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18445 1726882530.60907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18445 1726882530.60921: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 18445 1726882530.60934: stderr chunk (state=3): >>>debug2: match found <<< 18445 1726882530.60956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18445 1726882530.61034: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 18445 1726882530.61088: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 18445 1726882530.61110: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18445 1726882530.61287: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18445 1726882530.62961: stdout chunk (state=3): >>>/root <<< 18445 1726882530.63167: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18445 1726882530.63172: stdout chunk (state=3): >>><<< 18445 1726882530.63176: stderr chunk (state=3): >>><<< 18445 1726882530.63296: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18445 1726882530.63307: _low_level_execute_command(): starting 18445 1726882530.63310: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882530.6319895-18572-103421101224882 `" && echo ansible-tmp-1726882530.6319895-18572-103421101224882="` echo /root/.ansible/tmp/ansible-tmp-1726882530.6319895-18572-103421101224882 `" ) && sleep 0' 18445 1726882530.64503: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18445 1726882530.64519: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18445 1726882530.64534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18445 1726882530.64553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18445 1726882530.64597: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 18445 1726882530.64617: stderr chunk (state=3): >>>debug2: match not found <<< 18445 1726882530.64632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18445 1726882530.64649: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18445 1726882530.64665: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 18445 1726882530.64678: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18445 1726882530.64690: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18445 1726882530.64704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18445 1726882530.64727: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18445 1726882530.64740: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 18445 1726882530.64751: stderr chunk (state=3): >>>debug2: match found <<< 18445 1726882530.64767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18445 1726882530.64848: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 18445 1726882530.64873: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18445 1726882530.64890: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18445 1726882530.65026: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18445 1726882530.66902: stdout chunk (state=3): >>>ansible-tmp-1726882530.6319895-18572-103421101224882=/root/.ansible/tmp/ansible-tmp-1726882530.6319895-18572-103421101224882 <<< 18445 1726882530.67089: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18445 1726882530.67092: stdout chunk (state=3): >>><<< 18445 1726882530.67105: stderr chunk (state=3): >>><<< 18445 1726882530.67278: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882530.6319895-18572-103421101224882=/root/.ansible/tmp/ansible-tmp-1726882530.6319895-18572-103421101224882 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18445 1726882530.67282: variable 'ansible_module_compression' from source: unknown 18445 1726882530.67284: ANSIBALLZ: Using lock for stat 18445 1726882530.67286: ANSIBALLZ: Acquiring lock 18445 1726882530.67288: ANSIBALLZ: Lock acquired: 140250087338784 18445 1726882530.67290: ANSIBALLZ: Creating module 18445 1726882530.87069: ANSIBALLZ: Writing module into payload 18445 1726882530.87185: ANSIBALLZ: Writing module 18445 1726882530.87213: ANSIBALLZ: Renaming module 18445 1726882530.87223: ANSIBALLZ: Done creating module 18445 1726882530.87244: variable 'ansible_facts' from source: unknown 18445 1726882530.87310: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882530.6319895-18572-103421101224882/AnsiballZ_stat.py 18445 1726882530.87466: Sending initial data 18445 1726882530.87469: Sent initial data (153 bytes) 18445 1726882530.88420: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18445 1726882530.88435: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 18445 1726882530.88451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18445 1726882530.88477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18445 1726882530.88520: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 18445 1726882530.88533: stderr chunk (state=3): >>>debug2: match not found <<< 18445 1726882530.88548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18445 1726882530.88573: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18445 1726882530.88586: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 18445 1726882530.88598: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18445 1726882530.88610: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18445 1726882530.88624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18445 1726882530.88639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18445 1726882530.88652: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 18445 1726882530.88672: stderr chunk (state=3): >>>debug2: match found <<< 18445 1726882530.88686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18445 1726882530.88767: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 18445 1726882530.88791: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18445 1726882530.88809: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18445 1726882530.88943: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18445 1726882530.91393: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 18445 1726882530.91487: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 18445 1726882530.91592: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18445x1hycoyh/tmpr399y8rx /root/.ansible/tmp/ansible-tmp-1726882530.6319895-18572-103421101224882/AnsiballZ_stat.py <<< 18445 1726882530.91687: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 18445 1726882530.93031: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18445 1726882530.93153: stderr chunk (state=3): >>><<< 18445 1726882530.93157: stdout chunk (state=3): >>><<< 18445 1726882530.93175: done transferring module to remote 18445 1726882530.93187: _low_level_execute_command(): starting 18445 1726882530.93192: _low_level_execute_command(): executing: /bin/sh -c 
'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882530.6319895-18572-103421101224882/ /root/.ansible/tmp/ansible-tmp-1726882530.6319895-18572-103421101224882/AnsiballZ_stat.py && sleep 0' 18445 1726882530.93665: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18445 1726882530.93670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18445 1726882530.93702: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 18445 1726882530.93706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18445 1726882530.93709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18445 1726882530.93762: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 18445 1726882530.93771: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18445 1726882530.93872: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18445 1726882530.96381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18445 1726882530.96453: stderr chunk (state=3): >>><<< 18445 1726882530.96460: stdout chunk (state=3): >>><<< 18445 1726882530.96559: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 18445 1726882530.96566: _low_level_execute_command(): starting 18445 1726882530.96569: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882530.6319895-18572-103421101224882/AnsiballZ_stat.py && sleep 0' 18445 1726882530.97190: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18445 1726882530.97229: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18445 1726882530.97232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18445 1726882530.97234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18445 1726882530.97269: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18445 1726882530.97273: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 18445 1726882530.97288: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18445 1726882530.97342: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 18445 1726882530.97368: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18445 1726882530.97371: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18445 1726882530.97484: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18445 1726882531.00171: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 18445 1726882531.00173: stdout chunk (state=3): >>>import _imp # builtin <<< 18445 1726882531.00209: stdout chunk (state=3): >>>import '_thread' # <<< 18445 1726882531.00212: stdout chunk (state=3): >>>import '_warnings' # <<< 18445 1726882531.00214: stdout chunk (state=3): >>>import '_weakref' # <<< 18445 1726882531.00318: stdout chunk (state=3): >>>import '_io' # <<< 18445 1726882531.00331: stdout chunk (state=3): >>>import 'marshal' # <<< 18445 1726882531.00378: stdout chunk (state=3): >>>import 'posix' # <<< 18445 1726882531.00423: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 18445 1726882531.00429: stdout chunk (state=3): >>># installing zipimport hook <<< 18445 1726882531.00482: stdout chunk (state=3): >>>import 'time' # <<< 18445 1726882531.00507: stdout chunk (state=3): >>>import 'zipimport' # <<< 18445 1726882531.00510: stdout chunk (state=3): >>># installed zipimport hook <<< 18445 1726882531.00575: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py <<< 18445 1726882531.00591: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 18445 1726882531.00617: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 18445 1726882531.00645: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' <<< 18445 1726882531.00667: stdout chunk (state=3): >>>import '_codecs' # <<< 18445 1726882531.00699: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b6a43dc0> <<< 18445 1726882531.00745: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches 
/usr/lib64/python3.9/encodings/aliases.py <<< 18445 1726882531.00776: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' <<< 18445 1726882531.00786: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b67d83a0> <<< 18445 1726882531.00798: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b6a43b20> <<< 18445 1726882531.00826: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py <<< 18445 1726882531.00832: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 18445 1726882531.00862: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b6a43ac0> <<< 18445 1726882531.00899: stdout chunk (state=3): >>>import '_signal' # <<< 18445 1726882531.00932: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py <<< 18445 1726882531.00944: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 18445 1726882531.00950: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b67d8490> <<< 18445 1726882531.00989: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 18445 1726882531.01034: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py <<< 18445 1726882531.01037: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 18445 1726882531.01069: stdout chunk (state=3): >>>import '_abc' # <<< 18445 1726882531.01086: stdout chunk (state=3): >>>import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b67d8940> <<< 18445 1726882531.01126: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b67d8670> <<< 18445 1726882531.01177: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 18445 1726882531.01199: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 18445 1726882531.01234: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py<<< 18445 1726882531.01236: stdout chunk (state=3): >>> <<< 18445 1726882531.01282: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 18445 1726882531.01298: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 18445 1726882531.01328: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc'<<< 18445 1726882531.01333: stdout chunk (state=3): >>> <<< 18445 1726882531.01375: stdout chunk (state=3): >>>import '_stat' # <<< 18445 1726882531.01382: stdout chunk (state=3): >>>import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b678f190><<< 18445 1726882531.01388: stdout chunk (state=3): >>> <<< 18445 
1726882531.01411: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 18445 1726882531.01449: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 18445 1726882531.01573: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b678f220><<< 18445 1726882531.01578: stdout chunk (state=3): >>> <<< 18445 1726882531.01615: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py<<< 18445 1726882531.01618: stdout chunk (state=3): >>> <<< 18445 1726882531.01621: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 18445 1726882531.01665: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py<<< 18445 1726882531.01683: stdout chunk (state=3): >>> <<< 18445 1726882531.01699: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' <<< 18445 1726882531.01702: stdout chunk (state=3): >>>import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b67b2850> <<< 18445 1726882531.01706: stdout chunk (state=3): >>>import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b678f940> <<< 18445 1726882531.01753: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b67f0880><<< 18445 1726882531.01760: stdout chunk (state=3): >>> <<< 18445 1726882531.01799: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py<<< 18445 1726882531.01815: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc'<<< 18445 1726882531.01819: stdout chunk (state=3): >>> <<< 18445 1726882531.01825: stdout chunk (state=3): >>>import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b6787d90> <<< 18445 1726882531.01898: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py<<< 18445 1726882531.01912: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 18445 1726882531.01935: stdout chunk (state=3): >>>import '_locale' # <<< 18445 1726882531.01942: stdout chunk (state=3): >>> import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b67b2d90> <<< 18445 1726882531.02028: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b67d8970><<< 18445 1726882531.02035: stdout chunk (state=3): >>> <<< 18445 1726882531.02077: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) <<< 18445 1726882531.02081: stdout chunk (state=3): >>> [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information.<<< 18445 1726882531.02086: stdout chunk (state=3): >>> <<< 18445 1726882531.02419: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 18445 1726882531.02435: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 18445 
1726882531.02487: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py <<< 18445 1726882531.02500: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 18445 1726882531.02522: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 18445 1726882531.02560: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 18445 1726882531.02595: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 18445 1726882531.02621: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 18445 1726882531.02645: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b6753eb0> <<< 18445 1726882531.02731: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b6756f40> <<< 18445 1726882531.02758: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 18445 1726882531.02785: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 18445 1726882531.02819: stdout chunk (state=3): >>>import '_sre' # <<< 18445 1726882531.02849: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 18445 1726882531.02883: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 18445 1726882531.02940: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py <<< 18445 1726882531.02977: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b674c610> <<< 18445 1726882531.03016: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b6752640> <<< 18445 1726882531.03057: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b6753370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 18445 1726882531.03198: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 18445 1726882531.03210: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py<<< 18445 1726882531.03379: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b66d4e20> import 'heapq' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f50b66d4910> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b66d4f10> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 18445 1726882531.03399: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 18445 1726882531.03402: stdout chunk (state=3): >>>import '_operator' # <<< 18445 1726882531.03408: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b66d4fd0> <<< 18445 1726882531.03437: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py <<< 18445 1726882531.03440: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' <<< 18445 1726882531.03447: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b66e70d0> <<< 18445 1726882531.03471: stdout chunk (state=3): >>>import '_collections' # <<< 18445 1726882531.03517: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b672ed90> <<< 18445 1726882531.03536: stdout chunk (state=3): >>>import '_functools' # <<< 18445 1726882531.03560: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b6727670> <<< 18445 1726882531.03630: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b673a6d0> <<< 18445 1726882531.03640: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b675ae20> <<< 18445 1726882531.03670: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 18445 1726882531.03707: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b66e7cd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b672e2b0> <<< 18445 1726882531.03750: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 18445 1726882531.03753: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b673a2e0> <<< 18445 1726882531.03758: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b67609d0> <<< 18445 1726882531.03787: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches 
/usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 18445 1726882531.03827: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 18445 1726882531.03841: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 18445 1726882531.03867: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 18445 1726882531.03875: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b66e7eb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b66e7df0> <<< 18445 1726882531.03913: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' <<< 18445 1726882531.03924: stdout chunk (state=3): >>>import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b66e7d60> <<< 18445 1726882531.03942: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py <<< 18445 1726882531.03947: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 18445 1726882531.03974: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py <<< 18445 1726882531.03981: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 18445 1726882531.04006: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 18445 1726882531.04092: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 18445 1726882531.04121: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py <<< 18445 1726882531.04126: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b66b13d0> <<< 18445 1726882531.04145: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 18445 1726882531.04170: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 18445 1726882531.04206: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b66b14c0> <<< 18445 1726882531.04388: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b66eef40> <<< 18445 1726882531.04434: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b66e9a90> <<< 18445 1726882531.04452: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b66e9490> <<< 18445 1726882531.04472: stdout chunk 
(state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py <<< 18445 1726882531.04491: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 18445 1726882531.04528: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 18445 1726882531.04545: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 18445 1726882531.04572: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py <<< 18445 1726882531.04583: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' <<< 18445 1726882531.04588: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b65cb220> <<< 18445 1726882531.04626: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b669c520> <<< 18445 1726882531.04698: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b66e9f10> <<< 18445 1726882531.04703: stdout chunk (state=3): >>>import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b6760040> <<< 18445 1726882531.04724: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 18445 1726882531.04758: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 18445 1726882531.04792: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py <<< 18445 1726882531.04796: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b65ddb50> <<< 18445 1726882531.04802: stdout chunk (state=3): >>>import 'errno' # <<< 18445 1726882531.04837: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b65dde80> <<< 18445 1726882531.04859: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py <<< 18445 1726882531.04871: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 18445 1726882531.04900: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' <<< 18445 1726882531.04910: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b65ee790> <<< 18445 1726882531.04930: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 18445 1726882531.04971: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 18445 1726882531.05012: stdout chunk (state=3): >>>import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f50b65eecd0> <<< 18445 1726882531.05043: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b657c400> <<< 18445 1726882531.05072: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b65ddf70> <<< 18445 1726882531.05122: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 18445 1726882531.05165: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b658d2e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b65ee610> <<< 18445 1726882531.05178: stdout chunk (state=3): >>>import 'pwd' # <<< 18445 1726882531.05220: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b658d3a0> <<< 18445 1726882531.05288: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b66e7a30> <<< 18445 1726882531.05650: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 18445 1726882531.05657: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b65a8700> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b65a89d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b65a87c0> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b65a88b0> # 
/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 18445 1726882531.05782: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b65a8d00> <<< 18445 1726882531.05810: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' <<< 18445 1726882531.05821: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b65b3250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b65a8940> <<< 18445 1726882531.05842: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b659ca90> <<< 18445 1726882531.05868: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b66e7610> <<< 18445 1726882531.05899: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 18445 1726882531.05973: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 18445 1726882531.06017: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b65a8af0> <<< 18445 1726882531.06148: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 18445 1726882531.06177: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f50b64d06d0> <<< 18445 1726882531.06441: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip' # zipimport: zlib available <<< 18445 1726882531.06579: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882531.06620: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/__init__.py # zipimport: zlib available <<< 18445 1726882531.06659: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882531.06683: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available <<< 18445 1726882531.08579: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882531.10179: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b63ce820> <<< 18445 1726882531.10194: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 18445 1726882531.10257: stdout chunk (state=3): 
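
The "# zipimport: found 30 names in '/tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip'" entry above is where the remote interpreter stops importing from the system site-packages and starts pulling the ansible.* support code out of the zipped payload under /tmp on the target; every later "import ansible.module_utils.* # loaded from Zip ..." line comes from that archive. As a minimal sketch of the zip-import mechanism only (not the actual wrapper code), the snippet below builds a throwaway archive with an invented package, puts it on sys.path, and imports from it. The names demo_payload.zip and payloadpkg are made up for the illustration.

    # Minimal sketch: importing a package straight out of a zip archive,
    # the mechanism the "# zipimport: ..." lines above refer to.
    # demo_payload.zip and payloadpkg are invented names for this example.
    import importlib
    import sys
    import tempfile
    import zipfile

    zip_path = tempfile.mkdtemp() + "/demo_payload.zip"
    with zipfile.ZipFile(zip_path, "w") as zf:
        zf.writestr("payloadpkg/__init__.py", "")
        zf.writestr("payloadpkg/hello.py", "def greet():\n    return 'loaded from zip'\n")

    # Putting the archive on sys.path lets the zipimport hook serve the package.
    sys.path.insert(0, zip_path)
    hello = importlib.import_module("payloadpkg.hello")
    print(hello.greet())       # -> loaded from zip
    print(hello.__file__)      # -> .../demo_payload.zip/payloadpkg/hello.py
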
>>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 18445 1726882531.10293: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b645d730> <<< 18445 1726882531.10357: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b645d610> <<< 18445 1726882531.10399: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b645d340> <<< 18445 1726882531.10439: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py <<< 18445 1726882531.10442: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 18445 1726882531.10514: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b645d460> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b645d160> <<< 18445 1726882531.10517: stdout chunk (state=3): >>>import 'atexit' # <<< 18445 1726882531.10545: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b645d3a0> <<< 18445 1726882531.10569: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 18445 1726882531.10602: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 18445 1726882531.10661: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b645d790> <<< 18445 1726882531.10683: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 18445 1726882531.10697: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 18445 1726882531.10713: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 18445 1726882531.10746: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 18445 1726882531.10774: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py <<< 18445 1726882531.10778: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 18445 1726882531.10876: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b634e7f0> <<< 18445 1726882531.11014: stdout chunk (state=3): >>># extension module 
'_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b634eb80> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b634e9d0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 18445 1726882531.11017: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 18445 1726882531.11059: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b636daf0> <<< 18445 1726882531.11075: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b6457d60> <<< 18445 1726882531.11329: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b645d4f0> <<< 18445 1726882531.11359: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 18445 1726882531.11387: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b64571c0> <<< 18445 1726882531.11406: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py <<< 18445 1726882531.11437: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 18445 1726882531.11453: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 18445 1726882531.11493: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 18445 1726882531.11514: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 18445 1726882531.11526: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b63cab20> <<< 18445 1726882531.11650: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b6400eb0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b64008b0> <<< 18445 1726882531.11662: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b63672e0> <<< 18445 1726882531.11695: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b64009a0> <<< 18445 1726882531.11733: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b642ed00> <<< 18445 1726882531.11772: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 18445 1726882531.11775: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 18445 1726882531.11805: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 18445 1726882531.11836: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 18445 1726882531.11944: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b632fa00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b6436e80><<< 18445 1726882531.11948: stdout chunk (state=3): >>> <<< 18445 1726882531.11968: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 18445 1726882531.11980: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 18445 1726882531.12045: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' <<< 18445 1726882531.12050: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b633e0a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b6436eb0> <<< 18445 1726882531.12078: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 18445 1726882531.12123: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 18445 1726882531.12162: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' <<< 18445 1726882531.12167: stdout chunk (state=3): >>>import '_string' # <<< 18445 1726882531.12249: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b6403730> <<< 18445 1726882531.12459: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b633e0d0> <<< 18445 1726882531.12579: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 
'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b633b550> <<< 18445 1726882531.12614: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b633b610> <<< 18445 1726882531.12683: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b633ac40> <<< 18445 1726882531.12689: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b642eee0> <<< 18445 1726882531.12713: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 18445 1726882531.12737: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 18445 1726882531.12757: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 18445 1726882531.12813: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b63bfb50> <<< 18445 1726882531.13094: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b63bd940> <<< 18445 1726882531.13118: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b6331820> <<< 18445 1726882531.13147: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b63bf5b0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b63f7af0> <<< 18445 1726882531.13170: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18445 1726882531.13185: stdout chunk (state=3): >>>import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py <<< 18445 1726882531.13195: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882531.13301: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882531.13403: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882531.13419: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py <<< 18445 1726882531.13446: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18445 1726882531.13471: stdout chunk (state=3): >>>import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available <<< 18445 1726882531.13619: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882531.13766: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882531.14513: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882531.15249: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py <<< 18445 1726882531.15268: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py <<< 18445 1726882531.15294: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 18445 1726882531.15317: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 18445 1726882531.15377: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b59e5df0> <<< 18445 1726882531.15465: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b5de95b0><<< 18445 1726882531.15479: stdout chunk (state=3): >>> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b5dd6df0> <<< 18445 1726882531.15537: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available <<< 18445 1726882531.15570: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882531.15589: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available <<< 18445 1726882531.15774: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882531.15973: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # 
code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 18445 1726882531.16003: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b63b49d0> <<< 18445 1726882531.16014: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882531.16648: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882531.17235: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882531.17315: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882531.17404: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/common/collections.py <<< 18445 1726882531.17419: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882531.17456: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882531.17505: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available <<< 18445 1726882531.17595: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882531.17717: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/errors.py <<< 18445 1726882531.17745: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py <<< 18445 1726882531.17749: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882531.17798: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882531.17846: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available <<< 18445 1726882531.18149: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882531.18446: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 18445 1726882531.18503: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 18445 1726882531.18507: stdout chunk (state=3): >>>import '_ast' # <<< 18445 1726882531.18618: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b59b6e50> <<< 18445 1726882531.18621: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882531.18697: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882531.18808: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py <<< 18445 1726882531.18824: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded 
from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py <<< 18445 1726882531.18839: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882531.18881: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882531.18939: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/common/locale.py <<< 18445 1726882531.18944: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882531.18987: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882531.19031: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882531.19160: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882531.19236: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 18445 1726882531.19279: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 18445 1726882531.19363: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b6448910> <<< 18445 1726882531.19393: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b59b6be0> <<< 18445 1726882531.19448: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/common/process.py <<< 18445 1726882531.19451: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882531.19623: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882531.19699: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882531.19731: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882531.19781: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 18445 1726882531.19801: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 18445 1726882531.19816: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 18445 1726882531.19877: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 18445 1726882531.19889: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 18445 1726882531.19910: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 18445 1726882531.20050: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b5978c70> <<< 18445 1726882531.20105: stdout chunk (state=3): >>>import 
'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b5ddc670> <<< 18445 1726882531.20189: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b5ddb850> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available <<< 18445 1726882531.20221: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882531.20251: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py <<< 18445 1726882531.20353: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/basic.py <<< 18445 1726882531.20386: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/modules/__init__.py <<< 18445 1726882531.20405: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882531.20574: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882531.20841: stdout chunk (state=3): >>># zipimport: zlib available <<< 18445 1726882531.21045: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 18445 1726882531.21058: stdout chunk (state=3): >>># destroy __main__ <<< 18445 1726882531.21437: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr<<< 18445 1726882531.21550: stdout chunk (state=3): >>> # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # 
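
Buried near the end of that chunk is the one line the controller actually needs: the module's JSON result, {"changed": false, "stat": {"exists": false}, "invocation": {...}}, meaning /run/ostree-booted does not exist on the managed node. Everything after it (# destroy __main__, the # clear ... and # cleanup[...] lines) is just the interpreter shutting down with import tracing still enabled. As an illustrative sketch only, not Ansible's own result handling, the snippet below pulls such a result line out of noisy module stdout and reads the stat.exists flag the way a playbook would after registering the task result:

    # Illustrative only: pull the JSON result line out of noisy module stdout
    # and read the stat flags; this is not Ansible's internal parsing code.
    import json

    captured_stdout = """\
    import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b59b6e50>
    {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted"}}}
    # destroy __main__
    """

    result = None
    for line in captured_stdout.splitlines():
        line = line.strip()
        if line.startswith("{") and line.endswith("}"):
            try:
                result = json.loads(line)
                break
            except ValueError:
                continue

    assert result is not None
    print("changed:", result["changed"])                            # False
    print("/run/ostree-booted exists:", result["stat"]["exists"])   # False
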
cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib <<< 18445 1726882531.21715: stdout chunk (state=3): >>># cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing 
ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy <<< 18445 1726882531.21733: stdout chunk (state=3): >>># destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 18445 1726882531.21989: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 18445 1726882531.22031: stdout chunk (state=3): >>># destroy zipimport <<< 18445 1726882531.22074: stdout chunk (state=3): >>># destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma <<< 18445 1726882531.22093: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # 
destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings <<< 18445 1726882531.22143: stdout chunk (state=3): >>># destroy syslog # destroy uuid # destroy array <<< 18445 1726882531.22146: stdout chunk (state=3): >>># destroy datetime <<< 18445 1726882531.22161: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse <<< 18445 1726882531.22331: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache <<< 18445 1726882531.22338: stdout chunk (state=3): >>># cleanup[3] wiping tokenize # cleanup[3] wiping platform <<< 18445 1726882531.22343: stdout chunk (state=3): >>># destroy subprocess # cleanup[3] wiping selectors <<< 18445 1726882531.22350: stdout chunk (state=3): >>># cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math <<< 18445 1726882531.22359: stdout chunk (state=3): >>># cleanup[3] wiping shutil # destroy fnmatch <<< 18445 1726882531.22362: stdout chunk (state=3): >>># cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading <<< 18445 1726882531.22384: stdout chunk (state=3): >>># cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref <<< 18445 1726882531.22391: stdout chunk (state=3): >>># cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external <<< 18445 1726882531.22441: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path <<< 18445 1726882531.22455: stdout chunk (state=3): >>># destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io <<< 18445 1726882531.22462: stdout chunk (state=3): >>># destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs <<< 18445 1726882531.22494: stdout chunk (state=3): >>># cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # 
cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 18445 1726882531.22515: stdout chunk (state=3): >>># destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 18445 1726882531.22707: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse <<< 18445 1726882531.22748: stdout chunk (state=3): >>># destroy tokenize # destroy _heapq # destroy posixpath <<< 18445 1726882531.22757: stdout chunk (state=3): >>># destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors <<< 18445 1726882531.22800: stdout chunk (state=3): >>># destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves<<< 18445 1726882531.22805: stdout chunk (state=3): >>> # destroy _operator # destroy _frozen_importlib_external # destroy _imp <<< 18445 1726882531.22807: stdout chunk (state=3): >>># destroy io # destroy marshal <<< 18445 1726882531.22835: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 18445 1726882531.23368: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
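
At this point the remote interpreter has exited, the ssh master reports exit status 0, and the shared connection to 10.31.44.90 is closed; the controller then logs the assembled stdout in full, which is why the same import trace repeats below. Nearly all of that bulk, the "# ... .pyc matches ...", "# code object from ..." and "import x # <...Loader object at 0x...>" lines, is CPython's verbose import tracing, the output the interpreter emits when it runs with -v (or PYTHONVERBOSE=1). The same kind of trace can be reproduced locally with the short sketch below; run directly like this, the trace is written to stderr.

    # Reproduce the style of import trace seen throughout the stdout above.
    # CPython's -v switch (equivalent to PYTHONVERBOSE=1) logs every import
    # and every code object it loads; run this way, the trace goes to stderr.
    import subprocess
    import sys

    proc = subprocess.run(
        [sys.executable, "-v", "-c", "import json"],
        capture_output=True,
        text=True,
    )

    for line in proc.stderr.splitlines():
        if "json" in line:
            # e.g. "import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x...>"
            print(line)
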
<<< 18445 1726882531.23371: stdout chunk (state=3): >>><<< 18445 1726882531.23374: stderr chunk (state=3): >>><<< 18445 1726882531.23490: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b6a43dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b67d83a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b6a43b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b6a43ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b67d8490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b67d8940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b67d8670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b678f190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b678f220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from 
'/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b67b2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b678f940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b67f0880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b6787d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b67b2d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b67d8970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b6753eb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b6756f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b674c610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b6752640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b6753370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # 
code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b66d4e20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b66d4910> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b66d4f10> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b66d4fd0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b66e70d0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b672ed90> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b6727670> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b673a6d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b675ae20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b66e7cd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b672e2b0> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b673a2e0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b67609d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b66e7eb0> 
import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b66e7df0> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b66e7d60> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b66b13d0> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b66b14c0> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b66eef40> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b66e9a90> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b66e9490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b65cb220> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b669c520> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b66e9f10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b6760040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b65ddb50> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b65dde80> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches 
/usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b65ee790> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b65eecd0> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b657c400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b65ddf70> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b658d2e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b65ee610> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b658d3a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b66e7a30> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b65a8700> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b65a89d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b65a87c0> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b65a88b0> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b65a8d00> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b65b3250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b65a8940> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b659ca90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b66e7610> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b65a8af0> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f50b64d06d0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b63ce820> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b645d730> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b645d610> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b645d340> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches 
/usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b645d460> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b645d160> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b645d3a0> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b645d790> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b634e7f0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b634eb80> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b634e9d0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b636daf0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b6457d60> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b645d4f0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b64571c0> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches 
/usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b63cab20> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b6400eb0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b64008b0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b63672e0> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b64009a0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b642ed00> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b632fa00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b6436e80> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b633e0a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b6436eb0> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b6403730> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b633e0d0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b633b550> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b633b610> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b633ac40> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b642eee0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b63bfb50> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b63bd940> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b6331820> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b63bf5b0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b63f7af0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip 
/tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b59e5df0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b5de95b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b5dd6df0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b63b49d0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b59b6e50> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import 
ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50b6448910> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b59b6be0> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b5978c70> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b5ddc670> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50b5ddb850> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_gje7_0ju/ansible_stat_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # 
cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing 
ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # 
cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
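Buried in the stdout dump above is the module's actual JSON result, {"changed": false, "stat": {"exists": false}, ...}: /run/ostree-booted is absent on managed_node1, so the "Check if system is ostree" task sees stat.exists == false and treats the host as not OSTree-booted. Everything around that JSON is the interpreter trace, which is why the [WARNING] that follows reports "junk after the JSON data" while the result itself is still recovered. The helper below is a rough, hypothetical sketch (the name extract_embedded_json and its behavior are assumptions, not Ansible's actual parsing code) of how a leading-brace scan plus json.JSONDecoder.raw_decode can pull a JSON document out of such noisy output.

import json

def extract_embedded_json(raw_output: str):
    """Return (obj, junk_before, junk_after) for the first parseable JSON
    object in raw_output, or (None, raw_output, "") if none parses.
    Hypothetical illustration; Ansible's real parser works differently."""
    decoder = json.JSONDecoder()
    idx = raw_output.find("{")
    while idx != -1:
        try:
            obj, end = decoder.raw_decode(raw_output, idx)
        except json.JSONDecodeError:
            # This "{" did not start a valid JSON document; try the next one.
            idx = raw_output.find("{", idx + 1)
            continue
        return obj, raw_output[:idx], raw_output[end:]
    return None, raw_output, ""

# Usage with output shaped like the dump above:
noisy = 'import json # ... {"changed": false, "stat": {"exists": false}} # destroy __main__'
result, before, after = extract_embedded_json(noisy)
print(result["stat"]["exists"])   # False
print(bool(after.strip()))        # True: there was junk after the JSON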
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # 
cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] 
removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # 
cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks 18445 1726882531.24423: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882530.6319895-18572-103421101224882/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18445 1726882531.24427: _low_level_execute_command(): starting 18445 1726882531.24429: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882530.6319895-18572-103421101224882/ > /dev/null 2>&1 && sleep 0' 18445 1726882531.25080: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18445 1726882531.25099: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18445 1726882531.25114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18445 1726882531.25132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18445 1726882531.25180: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 18445 1726882531.25197: stderr chunk (state=3): >>>debug2: match not found <<< 18445 1726882531.25215: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 18445 1726882531.25231: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18445 1726882531.25242: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 18445 1726882531.25252: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18445 1726882531.25271: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18445 1726882531.25284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18445 1726882531.25306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18445 1726882531.25322: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 18445 1726882531.25333: stderr chunk (state=3): >>>debug2: match found <<< 18445 1726882531.25346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18445 1726882531.25436: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 18445 1726882531.25462: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18445 1726882531.25483: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18445 1726882531.25611: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18445 1726882531.28185: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18445 1726882531.28189: stdout chunk (state=3): >>><<< 18445 1726882531.28192: stderr chunk (state=3): >>><<< 18445 1726882531.28599: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 18445 1726882531.28602: handler run complete 18445 1726882531.28605: attempt loop complete, returning result 18445 1726882531.28607: _execute() done 18445 1726882531.28609: dumping result to json 18445 1726882531.28612: done dumping result, returning 18445 1726882531.28614: done running TaskExecutor() for managed_node1/TASK: Check if system is ostree [0e448fcc-3ce9-f6eb-935c-00000000008f] 18445 1726882531.28616: sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000008f 18445 1726882531.28686: done sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000008f 18445 1726882531.28690: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { 
"exists": false } } 18445 1726882531.28755: no more pending results, returning what we have 18445 1726882531.28758: results queue empty 18445 1726882531.28758: checking for any_errors_fatal 18445 1726882531.28767: done checking for any_errors_fatal 18445 1726882531.28767: checking for max_fail_percentage 18445 1726882531.28769: done checking for max_fail_percentage 18445 1726882531.28769: checking to see if all hosts have failed and the running result is not ok 18445 1726882531.28770: done checking to see if all hosts have failed 18445 1726882531.28771: getting the remaining hosts for this loop 18445 1726882531.28772: done getting the remaining hosts for this loop 18445 1726882531.28775: getting the next task for host managed_node1 18445 1726882531.28781: done getting next task for host managed_node1 18445 1726882531.28783: ^ task is: TASK: Set flag to indicate system is ostree 18445 1726882531.28785: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882531.28789: getting variables 18445 1726882531.28790: in VariableManager get_vars() 18445 1726882531.28816: Calling all_inventory to load vars for managed_node1 18445 1726882531.28818: Calling groups_inventory to load vars for managed_node1 18445 1726882531.28821: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882531.28830: Calling all_plugins_play to load vars for managed_node1 18445 1726882531.28832: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882531.28834: Calling groups_plugins_play to load vars for managed_node1 18445 1726882531.28997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882531.29202: done with get_vars() 18445 1726882531.29212: done getting variables 18445 1726882531.29306: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 21:35:31 -0400 (0:00:00.768) 0:00:03.230 ****** 18445 1726882531.29339: entering _queue_task() for managed_node1/set_fact 18445 1726882531.29341: Creating lock for set_fact 18445 1726882531.29621: worker is 1 (out of 1 available) 18445 1726882531.29631: exiting _queue_task() for managed_node1/set_fact 18445 1726882531.29642: done queuing things up, now waiting for results queue to drain 18445 1726882531.29643: waiting for pending results... 
18445 1726882531.29881: running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree 18445 1726882531.29989: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000090 18445 1726882531.30006: variable 'ansible_search_path' from source: unknown 18445 1726882531.30013: variable 'ansible_search_path' from source: unknown 18445 1726882531.30048: calling self._execute() 18445 1726882531.30125: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882531.30136: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882531.30148: variable 'omit' from source: magic vars 18445 1726882531.30613: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18445 1726882531.30860: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18445 1726882531.30908: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18445 1726882531.30943: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18445 1726882531.30987: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18445 1726882531.31077: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18445 1726882531.31107: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18445 1726882531.31137: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882531.31171: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18445 1726882531.31298: Evaluated conditional (not __network_is_ostree is defined): True 18445 1726882531.31308: variable 'omit' from source: magic vars 18445 1726882531.31344: variable 'omit' from source: magic vars 18445 1726882531.31471: variable '__ostree_booted_stat' from source: set_fact 18445 1726882531.31523: variable 'omit' from source: magic vars 18445 1726882531.31549: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18445 1726882531.31583: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18445 1726882531.31622: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18445 1726882531.31644: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18445 1726882531.31661: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18445 1726882531.31693: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18445 1726882531.31701: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882531.31708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882531.31806: Set connection var ansible_shell_type to sh 18445 
1726882531.31817: Set connection var ansible_module_compression to ZIP_DEFLATED 18445 1726882531.31830: Set connection var ansible_connection to ssh 18445 1726882531.31841: Set connection var ansible_pipelining to False 18445 1726882531.31851: Set connection var ansible_shell_executable to /bin/sh 18445 1726882531.31865: Set connection var ansible_timeout to 10 18445 1726882531.31890: variable 'ansible_shell_executable' from source: unknown 18445 1726882531.31898: variable 'ansible_connection' from source: unknown 18445 1726882531.31904: variable 'ansible_module_compression' from source: unknown 18445 1726882531.31910: variable 'ansible_shell_type' from source: unknown 18445 1726882531.31916: variable 'ansible_shell_executable' from source: unknown 18445 1726882531.31921: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882531.31928: variable 'ansible_pipelining' from source: unknown 18445 1726882531.31937: variable 'ansible_timeout' from source: unknown 18445 1726882531.31945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882531.32046: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 18445 1726882531.32068: variable 'omit' from source: magic vars 18445 1726882531.32077: starting attempt loop 18445 1726882531.32084: running the handler 18445 1726882531.32098: handler run complete 18445 1726882531.32110: attempt loop complete, returning result 18445 1726882531.32115: _execute() done 18445 1726882531.32121: dumping result to json 18445 1726882531.32127: done dumping result, returning 18445 1726882531.32136: done running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree [0e448fcc-3ce9-f6eb-935c-000000000090] 18445 1726882531.32149: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000090 18445 1726882531.32242: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000090 18445 1726882531.32249: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 18445 1726882531.32311: no more pending results, returning what we have 18445 1726882531.32313: results queue empty 18445 1726882531.32314: checking for any_errors_fatal 18445 1726882531.32322: done checking for any_errors_fatal 18445 1726882531.32323: checking for max_fail_percentage 18445 1726882531.32325: done checking for max_fail_percentage 18445 1726882531.32326: checking to see if all hosts have failed and the running result is not ok 18445 1726882531.32327: done checking to see if all hosts have failed 18445 1726882531.32327: getting the remaining hosts for this loop 18445 1726882531.32329: done getting the remaining hosts for this loop 18445 1726882531.32332: getting the next task for host managed_node1 18445 1726882531.32341: done getting next task for host managed_node1 18445 1726882531.32343: ^ task is: TASK: Fix CentOS6 Base repo 18445 1726882531.32346: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882531.32350: getting variables 18445 1726882531.32352: in VariableManager get_vars() 18445 1726882531.32383: Calling all_inventory to load vars for managed_node1 18445 1726882531.32386: Calling groups_inventory to load vars for managed_node1 18445 1726882531.32390: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882531.32399: Calling all_plugins_play to load vars for managed_node1 18445 1726882531.32403: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882531.32415: Calling groups_plugins_play to load vars for managed_node1 18445 1726882531.32586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882531.32782: done with get_vars() 18445 1726882531.32790: done getting variables 18445 1726882531.32905: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 21:35:31 -0400 (0:00:00.035) 0:00:03.266 ****** 18445 1726882531.32937: entering _queue_task() for managed_node1/copy 18445 1726882531.33345: worker is 1 (out of 1 available) 18445 1726882531.33358: exiting _queue_task() for managed_node1/copy 18445 1726882531.33371: done queuing things up, now waiting for results queue to drain 18445 1726882531.33372: waiting for pending results... 
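[editor's note] The "Fix CentOS6 Base repo" task queued here uses the copy action plugin (loaded above) and is skipped in the lines that follow because ansible_distribution_major_version == '6' evaluates False on this CentOS 9 host. The copy arguments are not printed at this verbosity, so the sketch below only reproduces the gating visible in the log; the destination and content are hypothetical placeholders.

    - name: Fix CentOS6 Base repo
      copy:
        dest: /etc/yum.repos.d/CentOS-Base.repo   # hypothetical; real arguments are not shown in this log
        content: |
          # ... vault/archive repo definition (not recoverable from the log) ...
      when:
        - ansible_distribution == 'CentOS'
        - ansible_distribution_major_version == '6'
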
18445 1726882531.33596: running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo 18445 1726882531.33692: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000092 18445 1726882531.33713: variable 'ansible_search_path' from source: unknown 18445 1726882531.33720: variable 'ansible_search_path' from source: unknown 18445 1726882531.33758: calling self._execute() 18445 1726882531.33831: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882531.33841: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882531.33852: variable 'omit' from source: magic vars 18445 1726882531.34291: variable 'ansible_distribution' from source: facts 18445 1726882531.34313: Evaluated conditional (ansible_distribution == 'CentOS'): True 18445 1726882531.34435: variable 'ansible_distribution_major_version' from source: facts 18445 1726882531.34445: Evaluated conditional (ansible_distribution_major_version == '6'): False 18445 1726882531.34452: when evaluation is False, skipping this task 18445 1726882531.34460: _execute() done 18445 1726882531.34471: dumping result to json 18445 1726882531.34480: done dumping result, returning 18445 1726882531.34488: done running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo [0e448fcc-3ce9-f6eb-935c-000000000092] 18445 1726882531.34497: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000092 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 18445 1726882531.34643: no more pending results, returning what we have 18445 1726882531.34646: results queue empty 18445 1726882531.34647: checking for any_errors_fatal 18445 1726882531.34652: done checking for any_errors_fatal 18445 1726882531.34653: checking for max_fail_percentage 18445 1726882531.34657: done checking for max_fail_percentage 18445 1726882531.34658: checking to see if all hosts have failed and the running result is not ok 18445 1726882531.34659: done checking to see if all hosts have failed 18445 1726882531.34660: getting the remaining hosts for this loop 18445 1726882531.34661: done getting the remaining hosts for this loop 18445 1726882531.34666: getting the next task for host managed_node1 18445 1726882531.34673: done getting next task for host managed_node1 18445 1726882531.34676: ^ task is: TASK: Include the task 'enable_epel.yml' 18445 1726882531.34679: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18445 1726882531.34683: getting variables 18445 1726882531.34684: in VariableManager get_vars() 18445 1726882531.34710: Calling all_inventory to load vars for managed_node1 18445 1726882531.34712: Calling groups_inventory to load vars for managed_node1 18445 1726882531.34716: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882531.34727: Calling all_plugins_play to load vars for managed_node1 18445 1726882531.34730: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882531.34733: Calling groups_plugins_play to load vars for managed_node1 18445 1726882531.34934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882531.35136: done with get_vars() 18445 1726882531.35145: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 21:35:31 -0400 (0:00:00.023) 0:00:03.289 ****** 18445 1726882531.35246: entering _queue_task() for managed_node1/include_tasks 18445 1726882531.35267: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000092 18445 1726882531.35275: WORKER PROCESS EXITING 18445 1726882531.35610: worker is 1 (out of 1 available) 18445 1726882531.35619: exiting _queue_task() for managed_node1/include_tasks 18445 1726882531.35627: done queuing things up, now waiting for results queue to drain 18445 1726882531.35629: waiting for pending results... 18445 1726882531.35840: running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' 18445 1726882531.35938: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000093 18445 1726882531.35957: variable 'ansible_search_path' from source: unknown 18445 1726882531.35968: variable 'ansible_search_path' from source: unknown 18445 1726882531.36003: calling self._execute() 18445 1726882531.36073: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882531.36084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882531.36096: variable 'omit' from source: magic vars 18445 1726882531.36548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882531.38893: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882531.38959: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882531.39004: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882531.39042: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882531.39076: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882531.39157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882531.39203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882531.39236: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882531.39288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882531.39307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882531.39421: variable '__network_is_ostree' from source: set_fact 18445 1726882531.39446: Evaluated conditional (not __network_is_ostree | d(false)): True 18445 1726882531.39459: _execute() done 18445 1726882531.39469: dumping result to json 18445 1726882531.39477: done dumping result, returning 18445 1726882531.39485: done running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' [0e448fcc-3ce9-f6eb-935c-000000000093] 18445 1726882531.39494: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000093 18445 1726882531.39606: no more pending results, returning what we have 18445 1726882531.39611: in VariableManager get_vars() 18445 1726882531.39640: Calling all_inventory to load vars for managed_node1 18445 1726882531.39642: Calling groups_inventory to load vars for managed_node1 18445 1726882531.39646: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882531.39657: Calling all_plugins_play to load vars for managed_node1 18445 1726882531.39662: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882531.39667: Calling groups_plugins_play to load vars for managed_node1 18445 1726882531.39835: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882531.40034: done with get_vars() 18445 1726882531.40041: variable 'ansible_search_path' from source: unknown 18445 1726882531.40042: variable 'ansible_search_path' from source: unknown 18445 1726882531.40082: we have included files to process 18445 1726882531.40083: generating all_blocks data 18445 1726882531.40085: done generating all_blocks data 18445 1726882531.40095: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 18445 1726882531.40097: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 18445 1726882531.40100: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 18445 1726882531.40594: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000093 18445 1726882531.40598: WORKER PROCESS EXITING 18445 1726882531.41018: done processing included file 18445 1726882531.41020: iterating over new_blocks loaded from include file 18445 1726882531.41022: in VariableManager get_vars() 18445 1726882531.41032: done with get_vars() 18445 1726882531.41034: filtering new block on tags 18445 1726882531.41058: done filtering new block on tags 18445 1726882531.41060: in VariableManager get_vars() 18445 1726882531.41072: done with get_vars() 18445 1726882531.41074: filtering new block on tags 18445 1726882531.41085: done filtering new block on tags 18445 1726882531.41087: done iterating over new_blocks loaded from include file included: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node1 18445 1726882531.41092: extending task lists for all hosts with included blocks 18445 1726882531.41193: done extending task lists 18445 1726882531.41194: done processing included files 18445 1726882531.41195: results queue empty 18445 1726882531.41195: checking for any_errors_fatal 18445 1726882531.41198: done checking for any_errors_fatal 18445 1726882531.41199: checking for max_fail_percentage 18445 1726882531.41200: done checking for max_fail_percentage 18445 1726882531.41201: checking to see if all hosts have failed and the running result is not ok 18445 1726882531.41202: done checking to see if all hosts have failed 18445 1726882531.41202: getting the remaining hosts for this loop 18445 1726882531.41204: done getting the remaining hosts for this loop 18445 1726882531.41206: getting the next task for host managed_node1 18445 1726882531.41209: done getting next task for host managed_node1 18445 1726882531.41211: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 18445 1726882531.41214: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18445 1726882531.41216: getting variables 18445 1726882531.41217: in VariableManager get_vars() 18445 1726882531.41223: Calling all_inventory to load vars for managed_node1 18445 1726882531.41225: Calling groups_inventory to load vars for managed_node1 18445 1726882531.41227: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882531.41232: Calling all_plugins_play to load vars for managed_node1 18445 1726882531.41238: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882531.41241: Calling groups_plugins_play to load vars for managed_node1 18445 1726882531.41396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882531.41590: done with get_vars() 18445 1726882531.41597: done getting variables 18445 1726882531.41659: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 18445 1726882531.41848: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 9] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 21:35:31 -0400 (0:00:00.066) 0:00:03.355 ****** 18445 1726882531.41895: entering _queue_task() for managed_node1/command 18445 1726882531.41897: Creating lock for command 18445 1726882531.42101: worker is 1 (out of 1 available) 18445 1726882531.42110: exiting _queue_task() for managed_node1/command 18445 1726882531.42120: done queuing things up, now waiting for results queue to drain 18445 1726882531.42121: waiting for pending results... 
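[editor's note] The include that produced the "Create EPEL 9" task header above is presumably a task along these lines, reconstructed from the task name, the conditional "not __network_is_ostree | d(false)" evaluated earlier, and the resolved file path .../tests/network/tasks/enable_epel.yml reported in the log; the relative path form is an assumption. Note that the raw task name "Create EPEL {{ ansible_distribution_major_version }}" is templated per host, which is why it renders as "Create EPEL 9" here.

    - name: Include the task 'enable_epel.yml'
      include_tasks: enable_epel.yml        # relative path assumed; resolves to tasks/enable_epel.yml per the log
      when: not __network_is_ostree | d(false)
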
18445 1726882531.42351: running TaskExecutor() for managed_node1/TASK: Create EPEL 9 18445 1726882531.42456: in run() - task 0e448fcc-3ce9-f6eb-935c-0000000000ad 18445 1726882531.42481: variable 'ansible_search_path' from source: unknown 18445 1726882531.42489: variable 'ansible_search_path' from source: unknown 18445 1726882531.42526: calling self._execute() 18445 1726882531.42602: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882531.42612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882531.42625: variable 'omit' from source: magic vars 18445 1726882531.42982: variable 'ansible_distribution' from source: facts 18445 1726882531.42997: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 18445 1726882531.43134: variable 'ansible_distribution_major_version' from source: facts 18445 1726882531.43145: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 18445 1726882531.43156: when evaluation is False, skipping this task 18445 1726882531.43165: _execute() done 18445 1726882531.43173: dumping result to json 18445 1726882531.43181: done dumping result, returning 18445 1726882531.43190: done running TaskExecutor() for managed_node1/TASK: Create EPEL 9 [0e448fcc-3ce9-f6eb-935c-0000000000ad] 18445 1726882531.43200: sending task result for task 0e448fcc-3ce9-f6eb-935c-0000000000ad 18445 1726882531.43306: done sending task result for task 0e448fcc-3ce9-f6eb-935c-0000000000ad 18445 1726882531.43314: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 18445 1726882531.43385: no more pending results, returning what we have 18445 1726882531.43388: results queue empty 18445 1726882531.43389: checking for any_errors_fatal 18445 1726882531.43391: done checking for any_errors_fatal 18445 1726882531.43391: checking for max_fail_percentage 18445 1726882531.43393: done checking for max_fail_percentage 18445 1726882531.43394: checking to see if all hosts have failed and the running result is not ok 18445 1726882531.43394: done checking to see if all hosts have failed 18445 1726882531.43395: getting the remaining hosts for this loop 18445 1726882531.43397: done getting the remaining hosts for this loop 18445 1726882531.43400: getting the next task for host managed_node1 18445 1726882531.43406: done getting next task for host managed_node1 18445 1726882531.43408: ^ task is: TASK: Install yum-utils package 18445 1726882531.43412: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18445 1726882531.43416: getting variables 18445 1726882531.43418: in VariableManager get_vars() 18445 1726882531.43446: Calling all_inventory to load vars for managed_node1 18445 1726882531.43448: Calling groups_inventory to load vars for managed_node1 18445 1726882531.43451: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882531.43467: Calling all_plugins_play to load vars for managed_node1 18445 1726882531.43471: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882531.43474: Calling groups_plugins_play to load vars for managed_node1 18445 1726882531.43630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882531.43827: done with get_vars() 18445 1726882531.43836: done getting variables 18445 1726882531.43924: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 21:35:31 -0400 (0:00:00.020) 0:00:03.376 ****** 18445 1726882531.43950: entering _queue_task() for managed_node1/package 18445 1726882531.43951: Creating lock for package 18445 1726882531.44156: worker is 1 (out of 1 available) 18445 1726882531.44168: exiting _queue_task() for managed_node1/package 18445 1726882531.44179: done queuing things up, now waiting for results queue to drain 18445 1726882531.44180: waiting for pending results... 
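[editor's note] The "Install yum-utils package" task queued here uses the package action plugin and, as the following lines show, is skipped on this EL9 host because the major version is not in ['7', '8']. A minimal sketch consistent with the task name and the conditionals printed in the log; the package name itself is inferred from the task name and not printed by Ansible here.

    - name: Install yum-utils package
      package:
        name: yum-utils                     # assumed from the task name
      when:
        - ansible_distribution in ['RedHat', 'CentOS']
        - ansible_distribution_major_version in ['7', '8']
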
18445 1726882531.44401: running TaskExecutor() for managed_node1/TASK: Install yum-utils package 18445 1726882531.44511: in run() - task 0e448fcc-3ce9-f6eb-935c-0000000000ae 18445 1726882531.44529: variable 'ansible_search_path' from source: unknown 18445 1726882531.44536: variable 'ansible_search_path' from source: unknown 18445 1726882531.44576: calling self._execute() 18445 1726882531.44717: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882531.44727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882531.44743: variable 'omit' from source: magic vars 18445 1726882531.45089: variable 'ansible_distribution' from source: facts 18445 1726882531.45105: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 18445 1726882531.45238: variable 'ansible_distribution_major_version' from source: facts 18445 1726882531.45248: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 18445 1726882531.45258: when evaluation is False, skipping this task 18445 1726882531.45266: _execute() done 18445 1726882531.45272: dumping result to json 18445 1726882531.45284: done dumping result, returning 18445 1726882531.45294: done running TaskExecutor() for managed_node1/TASK: Install yum-utils package [0e448fcc-3ce9-f6eb-935c-0000000000ae] 18445 1726882531.45303: sending task result for task 0e448fcc-3ce9-f6eb-935c-0000000000ae skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 18445 1726882531.45485: no more pending results, returning what we have 18445 1726882531.45488: results queue empty 18445 1726882531.45489: checking for any_errors_fatal 18445 1726882531.45493: done checking for any_errors_fatal 18445 1726882531.45493: checking for max_fail_percentage 18445 1726882531.45495: done checking for max_fail_percentage 18445 1726882531.45496: checking to see if all hosts have failed and the running result is not ok 18445 1726882531.45497: done checking to see if all hosts have failed 18445 1726882531.45497: getting the remaining hosts for this loop 18445 1726882531.45499: done getting the remaining hosts for this loop 18445 1726882531.45502: getting the next task for host managed_node1 18445 1726882531.45508: done getting next task for host managed_node1 18445 1726882531.45511: ^ task is: TASK: Enable EPEL 7 18445 1726882531.45515: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18445 1726882531.45518: getting variables 18445 1726882531.45519: in VariableManager get_vars() 18445 1726882531.45541: Calling all_inventory to load vars for managed_node1 18445 1726882531.45543: Calling groups_inventory to load vars for managed_node1 18445 1726882531.45547: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882531.45560: Calling all_plugins_play to load vars for managed_node1 18445 1726882531.45565: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882531.45568: Calling groups_plugins_play to load vars for managed_node1 18445 1726882531.45730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882531.45934: done with get_vars() 18445 1726882531.45943: done getting variables 18445 1726882531.46013: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 21:35:31 -0400 (0:00:00.020) 0:00:03.397 ****** 18445 1726882531.46043: entering _queue_task() for managed_node1/command 18445 1726882531.46062: done sending task result for task 0e448fcc-3ce9-f6eb-935c-0000000000ae 18445 1726882531.46072: WORKER PROCESS EXITING 18445 1726882531.46412: worker is 1 (out of 1 available) 18445 1726882531.46422: exiting _queue_task() for managed_node1/command 18445 1726882531.46433: done queuing things up, now waiting for results queue to drain 18445 1726882531.46434: waiting for pending results... 
18445 1726882531.46649: running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 18445 1726882531.46758: in run() - task 0e448fcc-3ce9-f6eb-935c-0000000000af 18445 1726882531.46781: variable 'ansible_search_path' from source: unknown 18445 1726882531.46789: variable 'ansible_search_path' from source: unknown 18445 1726882531.46825: calling self._execute() 18445 1726882531.46891: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882531.46899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882531.46911: variable 'omit' from source: magic vars 18445 1726882531.47260: variable 'ansible_distribution' from source: facts 18445 1726882531.47280: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 18445 1726882531.47412: variable 'ansible_distribution_major_version' from source: facts 18445 1726882531.47428: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 18445 1726882531.47435: when evaluation is False, skipping this task 18445 1726882531.47442: _execute() done 18445 1726882531.47448: dumping result to json 18445 1726882531.47458: done dumping result, returning 18445 1726882531.47470: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 [0e448fcc-3ce9-f6eb-935c-0000000000af] 18445 1726882531.47480: sending task result for task 0e448fcc-3ce9-f6eb-935c-0000000000af skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 18445 1726882531.47618: no more pending results, returning what we have 18445 1726882531.47621: results queue empty 18445 1726882531.47622: checking for any_errors_fatal 18445 1726882531.47628: done checking for any_errors_fatal 18445 1726882531.47628: checking for max_fail_percentage 18445 1726882531.47630: done checking for max_fail_percentage 18445 1726882531.47631: checking to see if all hosts have failed and the running result is not ok 18445 1726882531.47632: done checking to see if all hosts have failed 18445 1726882531.47633: getting the remaining hosts for this loop 18445 1726882531.47635: done getting the remaining hosts for this loop 18445 1726882531.47638: getting the next task for host managed_node1 18445 1726882531.47645: done getting next task for host managed_node1 18445 1726882531.47647: ^ task is: TASK: Enable EPEL 8 18445 1726882531.47652: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18445 1726882531.47658: getting variables 18445 1726882531.47660: in VariableManager get_vars() 18445 1726882531.47688: Calling all_inventory to load vars for managed_node1 18445 1726882531.47691: Calling groups_inventory to load vars for managed_node1 18445 1726882531.47695: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882531.47706: Calling all_plugins_play to load vars for managed_node1 18445 1726882531.47710: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882531.47714: Calling groups_plugins_play to load vars for managed_node1 18445 1726882531.47893: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882531.48115: done with get_vars() 18445 1726882531.48124: done getting variables 18445 1726882531.48198: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 21:35:31 -0400 (0:00:00.021) 0:00:03.419 ****** 18445 1726882531.48230: entering _queue_task() for managed_node1/command 18445 1726882531.48248: done sending task result for task 0e448fcc-3ce9-f6eb-935c-0000000000af 18445 1726882531.48261: WORKER PROCESS EXITING 18445 1726882531.48602: worker is 1 (out of 1 available) 18445 1726882531.48610: exiting _queue_task() for managed_node1/command 18445 1726882531.48621: done queuing things up, now waiting for results queue to drain 18445 1726882531.48622: waiting for pending results... 
18445 1726882531.48851: running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 18445 1726882531.48961: in run() - task 0e448fcc-3ce9-f6eb-935c-0000000000b0 18445 1726882531.48983: variable 'ansible_search_path' from source: unknown 18445 1726882531.48990: variable 'ansible_search_path' from source: unknown 18445 1726882531.49026: calling self._execute() 18445 1726882531.49101: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882531.49112: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882531.49125: variable 'omit' from source: magic vars 18445 1726882531.49486: variable 'ansible_distribution' from source: facts 18445 1726882531.49501: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 18445 1726882531.49635: variable 'ansible_distribution_major_version' from source: facts 18445 1726882531.49646: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 18445 1726882531.49653: when evaluation is False, skipping this task 18445 1726882531.49662: _execute() done 18445 1726882531.49671: dumping result to json 18445 1726882531.49678: done dumping result, returning 18445 1726882531.49686: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 [0e448fcc-3ce9-f6eb-935c-0000000000b0] 18445 1726882531.49694: sending task result for task 0e448fcc-3ce9-f6eb-935c-0000000000b0 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 18445 1726882531.49822: no more pending results, returning what we have 18445 1726882531.49825: results queue empty 18445 1726882531.49826: checking for any_errors_fatal 18445 1726882531.49833: done checking for any_errors_fatal 18445 1726882531.49834: checking for max_fail_percentage 18445 1726882531.49836: done checking for max_fail_percentage 18445 1726882531.49837: checking to see if all hosts have failed and the running result is not ok 18445 1726882531.49838: done checking to see if all hosts have failed 18445 1726882531.49838: getting the remaining hosts for this loop 18445 1726882531.49840: done getting the remaining hosts for this loop 18445 1726882531.49843: getting the next task for host managed_node1 18445 1726882531.49852: done getting next task for host managed_node1 18445 1726882531.49857: ^ task is: TASK: Enable EPEL 6 18445 1726882531.49862: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18445 1726882531.49867: getting variables 18445 1726882531.49869: in VariableManager get_vars() 18445 1726882531.49896: Calling all_inventory to load vars for managed_node1 18445 1726882531.49899: Calling groups_inventory to load vars for managed_node1 18445 1726882531.49903: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882531.49914: Calling all_plugins_play to load vars for managed_node1 18445 1726882531.49918: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882531.49922: Calling groups_plugins_play to load vars for managed_node1 18445 1726882531.50097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882531.50325: done with get_vars() 18445 1726882531.50333: done getting variables 18445 1726882531.50389: done sending task result for task 0e448fcc-3ce9-f6eb-935c-0000000000b0 18445 1726882531.50392: WORKER PROCESS EXITING 18445 1726882531.50426: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 21:35:31 -0400 (0:00:00.022) 0:00:03.441 ****** 18445 1726882531.50451: entering _queue_task() for managed_node1/copy 18445 1726882531.50609: worker is 1 (out of 1 available) 18445 1726882531.50621: exiting _queue_task() for managed_node1/copy 18445 1726882531.50631: done queuing things up, now waiting for results queue to drain 18445 1726882531.50633: waiting for pending results... 
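[editor's note] The "Enable EPEL 6" task that runs next is skipped like the EPEL 7/8/9 tasks before it, after which the play-level task from tests_ethernet_initscripts.yml:12 sets the provider fact. Judging from the task name and the ansible_facts result printed a little further down, that task is presumably equivalent to this minimal set_fact:

    - name: Set network provider to 'initscripts'
      set_fact:
        network_provider: initscripts       # matches the "network_provider": "initscripts" fact in the result below
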
18445 1726882531.50776: running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 18445 1726882531.50834: in run() - task 0e448fcc-3ce9-f6eb-935c-0000000000b2 18445 1726882531.50844: variable 'ansible_search_path' from source: unknown 18445 1726882531.50848: variable 'ansible_search_path' from source: unknown 18445 1726882531.50874: calling self._execute() 18445 1726882531.50920: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882531.50923: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882531.50932: variable 'omit' from source: magic vars 18445 1726882531.51171: variable 'ansible_distribution' from source: facts 18445 1726882531.51182: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 18445 1726882531.51256: variable 'ansible_distribution_major_version' from source: facts 18445 1726882531.51260: Evaluated conditional (ansible_distribution_major_version == '6'): False 18445 1726882531.51265: when evaluation is False, skipping this task 18445 1726882531.51267: _execute() done 18445 1726882531.51270: dumping result to json 18445 1726882531.51274: done dumping result, returning 18445 1726882531.51277: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 [0e448fcc-3ce9-f6eb-935c-0000000000b2] 18445 1726882531.51279: sending task result for task 0e448fcc-3ce9-f6eb-935c-0000000000b2 18445 1726882531.51365: done sending task result for task 0e448fcc-3ce9-f6eb-935c-0000000000b2 18445 1726882531.51369: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 18445 1726882531.51431: no more pending results, returning what we have 18445 1726882531.51434: results queue empty 18445 1726882531.51434: checking for any_errors_fatal 18445 1726882531.51436: done checking for any_errors_fatal 18445 1726882531.51437: checking for max_fail_percentage 18445 1726882531.51438: done checking for max_fail_percentage 18445 1726882531.51438: checking to see if all hosts have failed and the running result is not ok 18445 1726882531.51438: done checking to see if all hosts have failed 18445 1726882531.51439: getting the remaining hosts for this loop 18445 1726882531.51440: done getting the remaining hosts for this loop 18445 1726882531.51442: getting the next task for host managed_node1 18445 1726882531.51447: done getting next task for host managed_node1 18445 1726882531.51448: ^ task is: TASK: Set network provider to 'initscripts' 18445 1726882531.51450: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18445 1726882531.51452: getting variables 18445 1726882531.51453: in VariableManager get_vars() 18445 1726882531.51475: Calling all_inventory to load vars for managed_node1 18445 1726882531.51477: Calling groups_inventory to load vars for managed_node1 18445 1726882531.51479: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882531.51484: Calling all_plugins_play to load vars for managed_node1 18445 1726882531.51486: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882531.51488: Calling groups_plugins_play to load vars for managed_node1 18445 1726882531.51617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882531.51725: done with get_vars() 18445 1726882531.51731: done getting variables 18445 1726882531.51769: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set network provider to 'initscripts'] *********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_initscripts.yml:12 Friday 20 September 2024 21:35:31 -0400 (0:00:00.013) 0:00:03.454 ****** 18445 1726882531.51786: entering _queue_task() for managed_node1/set_fact 18445 1726882531.51925: worker is 1 (out of 1 available) 18445 1726882531.51935: exiting _queue_task() for managed_node1/set_fact 18445 1726882531.51944: done queuing things up, now waiting for results queue to drain 18445 1726882531.51946: waiting for pending results... 18445 1726882531.52108: running TaskExecutor() for managed_node1/TASK: Set network provider to 'initscripts' 18445 1726882531.52192: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000007 18445 1726882531.52196: variable 'ansible_search_path' from source: unknown 18445 1726882531.52276: calling self._execute() 18445 1726882531.52395: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882531.52399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882531.52401: variable 'omit' from source: magic vars 18445 1726882531.52432: variable 'omit' from source: magic vars 18445 1726882531.52466: variable 'omit' from source: magic vars 18445 1726882531.52512: variable 'omit' from source: magic vars 18445 1726882531.52552: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18445 1726882531.52596: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18445 1726882531.52624: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18445 1726882531.52641: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18445 1726882531.52653: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18445 1726882531.52687: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18445 1726882531.52691: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882531.52698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882531.52799: Set connection var 
ansible_shell_type to sh 18445 1726882531.52811: Set connection var ansible_module_compression to ZIP_DEFLATED 18445 1726882531.52821: Set connection var ansible_connection to ssh 18445 1726882531.52834: Set connection var ansible_pipelining to False 18445 1726882531.52840: Set connection var ansible_shell_executable to /bin/sh 18445 1726882531.52846: Set connection var ansible_timeout to 10 18445 1726882531.52871: variable 'ansible_shell_executable' from source: unknown 18445 1726882531.52874: variable 'ansible_connection' from source: unknown 18445 1726882531.52877: variable 'ansible_module_compression' from source: unknown 18445 1726882531.52879: variable 'ansible_shell_type' from source: unknown 18445 1726882531.52881: variable 'ansible_shell_executable' from source: unknown 18445 1726882531.52884: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882531.52888: variable 'ansible_pipelining' from source: unknown 18445 1726882531.52891: variable 'ansible_timeout' from source: unknown 18445 1726882531.52895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882531.53046: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 18445 1726882531.53056: variable 'omit' from source: magic vars 18445 1726882531.53067: starting attempt loop 18445 1726882531.53070: running the handler 18445 1726882531.53081: handler run complete 18445 1726882531.53091: attempt loop complete, returning result 18445 1726882531.53094: _execute() done 18445 1726882531.53096: dumping result to json 18445 1726882531.53099: done dumping result, returning 18445 1726882531.53105: done running TaskExecutor() for managed_node1/TASK: Set network provider to 'initscripts' [0e448fcc-3ce9-f6eb-935c-000000000007] 18445 1726882531.53110: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000007 18445 1726882531.53194: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000007 18445 1726882531.53196: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "network_provider": "initscripts" }, "changed": false } 18445 1726882531.53241: no more pending results, returning what we have 18445 1726882531.53244: results queue empty 18445 1726882531.53245: checking for any_errors_fatal 18445 1726882531.53250: done checking for any_errors_fatal 18445 1726882531.53251: checking for max_fail_percentage 18445 1726882531.53252: done checking for max_fail_percentage 18445 1726882531.53252: checking to see if all hosts have failed and the running result is not ok 18445 1726882531.53253: done checking to see if all hosts have failed 18445 1726882531.53254: getting the remaining hosts for this loop 18445 1726882531.53255: done getting the remaining hosts for this loop 18445 1726882531.53258: getting the next task for host managed_node1 18445 1726882531.53267: done getting next task for host managed_node1 18445 1726882531.53269: ^ task is: TASK: meta (flush_handlers) 18445 1726882531.53271: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18445 1726882531.53274: getting variables 18445 1726882531.53275: in VariableManager get_vars() 18445 1726882531.53296: Calling all_inventory to load vars for managed_node1 18445 1726882531.53299: Calling groups_inventory to load vars for managed_node1 18445 1726882531.53301: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882531.53308: Calling all_plugins_play to load vars for managed_node1 18445 1726882531.53311: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882531.53314: Calling groups_plugins_play to load vars for managed_node1 18445 1726882531.53490: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882531.53637: done with get_vars() 18445 1726882531.53646: done getting variables 18445 1726882531.53698: in VariableManager get_vars() 18445 1726882531.53704: Calling all_inventory to load vars for managed_node1 18445 1726882531.53705: Calling groups_inventory to load vars for managed_node1 18445 1726882531.53707: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882531.53709: Calling all_plugins_play to load vars for managed_node1 18445 1726882531.53711: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882531.53712: Calling groups_plugins_play to load vars for managed_node1 18445 1726882531.53792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882531.53922: done with get_vars() 18445 1726882531.53931: done queuing things up, now waiting for results queue to drain 18445 1726882531.53932: results queue empty 18445 1726882531.53932: checking for any_errors_fatal 18445 1726882531.53934: done checking for any_errors_fatal 18445 1726882531.53934: checking for max_fail_percentage 18445 1726882531.53935: done checking for max_fail_percentage 18445 1726882531.53935: checking to see if all hosts have failed and the running result is not ok 18445 1726882531.53936: done checking to see if all hosts have failed 18445 1726882531.53936: getting the remaining hosts for this loop 18445 1726882531.53937: done getting the remaining hosts for this loop 18445 1726882531.53938: getting the next task for host managed_node1 18445 1726882531.53940: done getting next task for host managed_node1 18445 1726882531.53941: ^ task is: TASK: meta (flush_handlers) 18445 1726882531.53942: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18445 1726882531.53947: getting variables 18445 1726882531.53947: in VariableManager get_vars() 18445 1726882531.53952: Calling all_inventory to load vars for managed_node1 18445 1726882531.53953: Calling groups_inventory to load vars for managed_node1 18445 1726882531.53956: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882531.53959: Calling all_plugins_play to load vars for managed_node1 18445 1726882531.53961: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882531.53962: Calling groups_plugins_play to load vars for managed_node1 18445 1726882531.54041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882531.54147: done with get_vars() 18445 1726882531.54152: done getting variables 18445 1726882531.54182: in VariableManager get_vars() 18445 1726882531.54188: Calling all_inventory to load vars for managed_node1 18445 1726882531.54189: Calling groups_inventory to load vars for managed_node1 18445 1726882531.54191: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882531.54194: Calling all_plugins_play to load vars for managed_node1 18445 1726882531.54195: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882531.54197: Calling groups_plugins_play to load vars for managed_node1 18445 1726882531.54279: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882531.54386: done with get_vars() 18445 1726882531.54393: done queuing things up, now waiting for results queue to drain 18445 1726882531.54394: results queue empty 18445 1726882531.54395: checking for any_errors_fatal 18445 1726882531.54395: done checking for any_errors_fatal 18445 1726882531.54396: checking for max_fail_percentage 18445 1726882531.54397: done checking for max_fail_percentage 18445 1726882531.54397: checking to see if all hosts have failed and the running result is not ok 18445 1726882531.54397: done checking to see if all hosts have failed 18445 1726882531.54398: getting the remaining hosts for this loop 18445 1726882531.54398: done getting the remaining hosts for this loop 18445 1726882531.54400: getting the next task for host managed_node1 18445 1726882531.54401: done getting next task for host managed_node1 18445 1726882531.54402: ^ task is: None 18445 1726882531.54403: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18445 1726882531.54403: done queuing things up, now waiting for results queue to drain 18445 1726882531.54404: results queue empty 18445 1726882531.54404: checking for any_errors_fatal 18445 1726882531.54405: done checking for any_errors_fatal 18445 1726882531.54405: checking for max_fail_percentage 18445 1726882531.54406: done checking for max_fail_percentage 18445 1726882531.54406: checking to see if all hosts have failed and the running result is not ok 18445 1726882531.54407: done checking to see if all hosts have failed 18445 1726882531.54408: getting the next task for host managed_node1 18445 1726882531.54409: done getting next task for host managed_node1 18445 1726882531.54409: ^ task is: None 18445 1726882531.54410: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882531.54444: in VariableManager get_vars() 18445 1726882531.54457: done with get_vars() 18445 1726882531.54461: in VariableManager get_vars() 18445 1726882531.54469: done with get_vars() 18445 1726882531.54472: variable 'omit' from source: magic vars 18445 1726882531.54490: in VariableManager get_vars() 18445 1726882531.54496: done with get_vars() 18445 1726882531.54507: variable 'omit' from source: magic vars PLAY [Play for showing the network provider] *********************************** 18445 1726882531.54765: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18445 1726882531.54785: getting the remaining hosts for this loop 18445 1726882531.54787: done getting the remaining hosts for this loop 18445 1726882531.54788: getting the next task for host managed_node1 18445 1726882531.54790: done getting next task for host managed_node1 18445 1726882531.54791: ^ task is: TASK: Gathering Facts 18445 1726882531.54792: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18445 1726882531.54793: getting variables 18445 1726882531.54794: in VariableManager get_vars() 18445 1726882531.54798: Calling all_inventory to load vars for managed_node1 18445 1726882531.54800: Calling groups_inventory to load vars for managed_node1 18445 1726882531.54801: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882531.54804: Calling all_plugins_play to load vars for managed_node1 18445 1726882531.54812: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882531.54814: Calling groups_plugins_play to load vars for managed_node1 18445 1726882531.54896: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882531.54999: done with get_vars() 18445 1726882531.55004: done getting variables 18445 1726882531.55027: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:3 Friday 20 September 2024 21:35:31 -0400 (0:00:00.032) 0:00:03.487 ****** 18445 1726882531.55041: entering _queue_task() for managed_node1/gather_facts 18445 1726882531.55181: worker is 1 (out of 1 available) 18445 1726882531.55190: exiting _queue_task() for managed_node1/gather_facts 18445 1726882531.55199: done queuing things up, now waiting for results queue to drain 18445 1726882531.55201: waiting for pending results... 
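This "Gathering Facts" entry is the implicit fact-gathering step of the play announced above; like the explicit tasks that follow, it is evaluated against the guard the log prints as the false_condition. Written out as if it were an explicit task (a sketch; only the condition text comes from the log), it would read roughly:

  - name: Gathering Facts
    ansible.builtin.gather_facts:
    when: >-
      ansible_distribution in ['CentOS', 'RedHat'] and
      ansible_distribution_major_version | int < 9

Because the managed node does not satisfy that condition, even fact gathering is skipped for this play.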
18445 1726882531.55332: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18445 1726882531.55386: in run() - task 0e448fcc-3ce9-f6eb-935c-0000000000d8 18445 1726882531.55396: variable 'ansible_search_path' from source: unknown 18445 1726882531.55421: calling self._execute() 18445 1726882531.55477: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882531.55481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882531.55488: variable 'omit' from source: magic vars 18445 1726882531.55927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882531.58110: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882531.58151: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882531.58191: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882531.58215: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882531.58235: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882531.58297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882531.58317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882531.58334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882531.58360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882531.58375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882531.58470: variable 'ansible_distribution' from source: facts 18445 1726882531.58474: variable 'ansible_distribution_major_version' from source: facts 18445 1726882531.58492: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882531.58501: when evaluation is False, skipping this task 18445 1726882531.58504: _execute() done 18445 1726882531.58507: dumping result to json 18445 1726882531.58510: done dumping result, returning 18445 1726882531.58516: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0e448fcc-3ce9-f6eb-935c-0000000000d8] 18445 1726882531.58521: sending task result for task 0e448fcc-3ce9-f6eb-935c-0000000000d8 18445 1726882531.58591: done sending task result for task 0e448fcc-3ce9-f6eb-935c-0000000000d8 18445 1726882531.58595: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result 
was False" } 18445 1726882531.58635: no more pending results, returning what we have 18445 1726882531.58637: results queue empty 18445 1726882531.58638: checking for any_errors_fatal 18445 1726882531.58639: done checking for any_errors_fatal 18445 1726882531.58640: checking for max_fail_percentage 18445 1726882531.58641: done checking for max_fail_percentage 18445 1726882531.58642: checking to see if all hosts have failed and the running result is not ok 18445 1726882531.58643: done checking to see if all hosts have failed 18445 1726882531.58643: getting the remaining hosts for this loop 18445 1726882531.58645: done getting the remaining hosts for this loop 18445 1726882531.58648: getting the next task for host managed_node1 18445 1726882531.58656: done getting next task for host managed_node1 18445 1726882531.58658: ^ task is: TASK: meta (flush_handlers) 18445 1726882531.58659: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882531.58663: getting variables 18445 1726882531.58666: in VariableManager get_vars() 18445 1726882531.58688: Calling all_inventory to load vars for managed_node1 18445 1726882531.58691: Calling groups_inventory to load vars for managed_node1 18445 1726882531.58694: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882531.58702: Calling all_plugins_play to load vars for managed_node1 18445 1726882531.58705: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882531.58707: Calling groups_plugins_play to load vars for managed_node1 18445 1726882531.58844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882531.58965: done with get_vars() 18445 1726882531.58972: done getting variables 18445 1726882531.59013: in VariableManager get_vars() 18445 1726882531.59019: Calling all_inventory to load vars for managed_node1 18445 1726882531.59020: Calling groups_inventory to load vars for managed_node1 18445 1726882531.59022: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882531.59024: Calling all_plugins_play to load vars for managed_node1 18445 1726882531.59026: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882531.59027: Calling groups_plugins_play to load vars for managed_node1 18445 1726882531.59109: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882531.59216: done with get_vars() 18445 1726882531.59224: done queuing things up, now waiting for results queue to drain 18445 1726882531.59225: results queue empty 18445 1726882531.59226: checking for any_errors_fatal 18445 1726882531.59227: done checking for any_errors_fatal 18445 1726882531.59228: checking for max_fail_percentage 18445 1726882531.59228: done checking for max_fail_percentage 18445 1726882531.59229: checking to see if all hosts have failed and the running result is not ok 18445 1726882531.59229: done checking to see if all hosts have failed 18445 1726882531.59230: getting the remaining hosts for this loop 18445 1726882531.59230: done getting the remaining hosts for this loop 18445 1726882531.59232: getting the next task for host managed_node1 18445 1726882531.59234: done getting next task for host managed_node1 18445 
1726882531.59235: ^ task is: TASK: Show inside ethernet tests 18445 1726882531.59236: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882531.59237: getting variables 18445 1726882531.59238: in VariableManager get_vars() 18445 1726882531.59242: Calling all_inventory to load vars for managed_node1 18445 1726882531.59243: Calling groups_inventory to load vars for managed_node1 18445 1726882531.59244: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882531.59251: Calling all_plugins_play to load vars for managed_node1 18445 1726882531.59252: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882531.59256: Calling groups_plugins_play to load vars for managed_node1 18445 1726882531.59356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882531.59509: done with get_vars() 18445 1726882531.59517: done getting variables 18445 1726882531.59570: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Show inside ethernet tests] ********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:6 Friday 20 September 2024 21:35:31 -0400 (0:00:00.045) 0:00:03.532 ****** 18445 1726882531.59593: entering _queue_task() for managed_node1/debug 18445 1726882531.59595: Creating lock for debug 18445 1726882531.59787: worker is 1 (out of 1 available) 18445 1726882531.59798: exiting _queue_task() for managed_node1/debug 18445 1726882531.59808: done queuing things up, now waiting for results queue to drain 18445 1726882531.59809: waiting for pending results... 
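The "Show inside ethernet tests" task uses the debug action module loaded above and is skipped under the same distribution guard. A sketch, assuming a simple message (the msg text is not in the log):

  - name: Show inside ethernet tests
    debug:
      msg: Inside ethernet tests   # assumed message text
    when: >-
      ansible_distribution in ['CentOS', 'RedHat'] and
      ansible_distribution_major_version | int < 9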
18445 1726882531.60044: running TaskExecutor() for managed_node1/TASK: Show inside ethernet tests 18445 1726882531.60131: in run() - task 0e448fcc-3ce9-f6eb-935c-00000000000b 18445 1726882531.60147: variable 'ansible_search_path' from source: unknown 18445 1726882531.60193: calling self._execute() 18445 1726882531.60278: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882531.60289: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882531.60302: variable 'omit' from source: magic vars 18445 1726882531.60740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882531.62936: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882531.63021: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882531.63072: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882531.63114: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882531.63144: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882531.63239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882531.63285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882531.63320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882531.63368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882531.63396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882531.63560: variable 'ansible_distribution' from source: facts 18445 1726882531.63578: variable 'ansible_distribution_major_version' from source: facts 18445 1726882531.63594: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882531.63597: when evaluation is False, skipping this task 18445 1726882531.63600: _execute() done 18445 1726882531.63602: dumping result to json 18445 1726882531.63604: done dumping result, returning 18445 1726882531.63609: done running TaskExecutor() for managed_node1/TASK: Show inside ethernet tests [0e448fcc-3ce9-f6eb-935c-00000000000b] 18445 1726882531.63619: sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000000b 18445 1726882531.63725: done sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000000b 18445 1726882531.63728: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 18445 1726882531.63788: no 
more pending results, returning what we have 18445 1726882531.63791: results queue empty 18445 1726882531.63791: checking for any_errors_fatal 18445 1726882531.63794: done checking for any_errors_fatal 18445 1726882531.63794: checking for max_fail_percentage 18445 1726882531.63796: done checking for max_fail_percentage 18445 1726882531.63796: checking to see if all hosts have failed and the running result is not ok 18445 1726882531.63797: done checking to see if all hosts have failed 18445 1726882531.63798: getting the remaining hosts for this loop 18445 1726882531.63799: done getting the remaining hosts for this loop 18445 1726882531.63803: getting the next task for host managed_node1 18445 1726882531.63808: done getting next task for host managed_node1 18445 1726882531.63811: ^ task is: TASK: Show network_provider 18445 1726882531.63813: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882531.63816: getting variables 18445 1726882531.63817: in VariableManager get_vars() 18445 1726882531.63850: Calling all_inventory to load vars for managed_node1 18445 1726882531.63855: Calling groups_inventory to load vars for managed_node1 18445 1726882531.63859: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882531.63874: Calling all_plugins_play to load vars for managed_node1 18445 1726882531.63877: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882531.63880: Calling groups_plugins_play to load vars for managed_node1 18445 1726882531.64009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882531.64124: done with get_vars() 18445 1726882531.64131: done getting variables 18445 1726882531.64178: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show network_provider] *************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:9 Friday 20 September 2024 21:35:31 -0400 (0:00:00.046) 0:00:03.579 ****** 18445 1726882531.64199: entering _queue_task() for managed_node1/debug 18445 1726882531.64381: worker is 1 (out of 1 available) 18445 1726882531.64392: exiting _queue_task() for managed_node1/debug 18445 1726882531.64403: done queuing things up, now waiting for results queue to drain 18445 1726882531.64405: waiting for pending results... 
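The "Show network_provider" task is another debug task; given the fact set earlier in this run, a plausible sketch (whether it uses var or msg is an assumption):

  - name: Show network_provider
    debug:
      var: network_provider   # would print "initscripts", the value set by the earlier set_fact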
18445 1726882531.64556: running TaskExecutor() for managed_node1/TASK: Show network_provider 18445 1726882531.64606: in run() - task 0e448fcc-3ce9-f6eb-935c-00000000000c 18445 1726882531.64615: variable 'ansible_search_path' from source: unknown 18445 1726882531.64649: calling self._execute() 18445 1726882531.64708: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882531.64714: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882531.64722: variable 'omit' from source: magic vars 18445 1726882531.65031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882531.67175: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882531.67221: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882531.67257: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882531.67285: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882531.67304: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882531.67367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882531.67387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882531.67405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882531.67431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882531.67444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882531.67541: variable 'ansible_distribution' from source: facts 18445 1726882531.67548: variable 'ansible_distribution_major_version' from source: facts 18445 1726882531.67567: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882531.67574: when evaluation is False, skipping this task 18445 1726882531.67577: _execute() done 18445 1726882531.67579: dumping result to json 18445 1726882531.67583: done dumping result, returning 18445 1726882531.67589: done running TaskExecutor() for managed_node1/TASK: Show network_provider [0e448fcc-3ce9-f6eb-935c-00000000000c] 18445 1726882531.67594: sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000000c 18445 1726882531.67672: done sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000000c 18445 1726882531.67675: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 18445 1726882531.67714: no more pending 
results, returning what we have 18445 1726882531.67717: results queue empty 18445 1726882531.67718: checking for any_errors_fatal 18445 1726882531.67724: done checking for any_errors_fatal 18445 1726882531.67724: checking for max_fail_percentage 18445 1726882531.67726: done checking for max_fail_percentage 18445 1726882531.67727: checking to see if all hosts have failed and the running result is not ok 18445 1726882531.67727: done checking to see if all hosts have failed 18445 1726882531.67728: getting the remaining hosts for this loop 18445 1726882531.67730: done getting the remaining hosts for this loop 18445 1726882531.67733: getting the next task for host managed_node1 18445 1726882531.67739: done getting next task for host managed_node1 18445 1726882531.67741: ^ task is: TASK: meta (flush_handlers) 18445 1726882531.67743: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882531.67746: getting variables 18445 1726882531.67748: in VariableManager get_vars() 18445 1726882531.67779: Calling all_inventory to load vars for managed_node1 18445 1726882531.67782: Calling groups_inventory to load vars for managed_node1 18445 1726882531.67786: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882531.67795: Calling all_plugins_play to load vars for managed_node1 18445 1726882531.67798: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882531.67801: Calling groups_plugins_play to load vars for managed_node1 18445 1726882531.67968: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882531.68080: done with get_vars() 18445 1726882531.68087: done getting variables 18445 1726882531.68133: in VariableManager get_vars() 18445 1726882531.68139: Calling all_inventory to load vars for managed_node1 18445 1726882531.68140: Calling groups_inventory to load vars for managed_node1 18445 1726882531.68142: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882531.68145: Calling all_plugins_play to load vars for managed_node1 18445 1726882531.68146: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882531.68148: Calling groups_plugins_play to load vars for managed_node1 18445 1726882531.68232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882531.68339: done with get_vars() 18445 1726882531.68347: done queuing things up, now waiting for results queue to drain 18445 1726882531.68349: results queue empty 18445 1726882531.68349: checking for any_errors_fatal 18445 1726882531.68351: done checking for any_errors_fatal 18445 1726882531.68351: checking for max_fail_percentage 18445 1726882531.68352: done checking for max_fail_percentage 18445 1726882531.68352: checking to see if all hosts have failed and the running result is not ok 18445 1726882531.68353: done checking to see if all hosts have failed 18445 1726882531.68353: getting the remaining hosts for this loop 18445 1726882531.68356: done getting the remaining hosts for this loop 18445 1726882531.68357: getting the next task for host managed_node1 18445 1726882531.68360: done getting next task for host managed_node1 18445 1726882531.68361: ^ task is: TASK: meta (flush_handlers) 18445 
1726882531.68361: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882531.68364: getting variables 18445 1726882531.68365: in VariableManager get_vars() 18445 1726882531.68370: Calling all_inventory to load vars for managed_node1 18445 1726882531.68371: Calling groups_inventory to load vars for managed_node1 18445 1726882531.68372: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882531.68375: Calling all_plugins_play to load vars for managed_node1 18445 1726882531.68381: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882531.68383: Calling groups_plugins_play to load vars for managed_node1 18445 1726882531.68462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882531.68569: done with get_vars() 18445 1726882531.68575: done getting variables 18445 1726882531.68604: in VariableManager get_vars() 18445 1726882531.68608: Calling all_inventory to load vars for managed_node1 18445 1726882531.68610: Calling groups_inventory to load vars for managed_node1 18445 1726882531.68611: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882531.68614: Calling all_plugins_play to load vars for managed_node1 18445 1726882531.68616: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882531.68618: Calling groups_plugins_play to load vars for managed_node1 18445 1726882531.68717: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882531.68822: done with get_vars() 18445 1726882531.68829: done queuing things up, now waiting for results queue to drain 18445 1726882531.68830: results queue empty 18445 1726882531.68831: checking for any_errors_fatal 18445 1726882531.68832: done checking for any_errors_fatal 18445 1726882531.68833: checking for max_fail_percentage 18445 1726882531.68833: done checking for max_fail_percentage 18445 1726882531.68834: checking to see if all hosts have failed and the running result is not ok 18445 1726882531.68835: done checking to see if all hosts have failed 18445 1726882531.68835: getting the remaining hosts for this loop 18445 1726882531.68836: done getting the remaining hosts for this loop 18445 1726882531.68838: getting the next task for host managed_node1 18445 1726882531.68839: done getting next task for host managed_node1 18445 1726882531.68840: ^ task is: None 18445 1726882531.68841: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18445 1726882531.68842: done queuing things up, now waiting for results queue to drain 18445 1726882531.68842: results queue empty 18445 1726882531.68843: checking for any_errors_fatal 18445 1726882531.68843: done checking for any_errors_fatal 18445 1726882531.68844: checking for max_fail_percentage 18445 1726882531.68845: done checking for max_fail_percentage 18445 1726882531.68845: checking to see if all hosts have failed and the running result is not ok 18445 1726882531.68846: done checking to see if all hosts have failed 18445 1726882531.68847: getting the next task for host managed_node1 18445 1726882531.68849: done getting next task for host managed_node1 18445 1726882531.68849: ^ task is: None 18445 1726882531.68850: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882531.68883: in VariableManager get_vars() 18445 1726882531.68893: done with get_vars() 18445 1726882531.68896: in VariableManager get_vars() 18445 1726882531.68902: done with get_vars() 18445 1726882531.68904: variable 'omit' from source: magic vars 18445 1726882531.68923: in VariableManager get_vars() 18445 1726882531.68928: done with get_vars() 18445 1726882531.68940: variable 'omit' from source: magic vars PLAY [Test configuring ethernet devices] *************************************** 18445 1726882531.69057: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18445 1726882531.69080: getting the remaining hosts for this loop 18445 1726882531.69081: done getting the remaining hosts for this loop 18445 1726882531.69083: getting the next task for host managed_node1 18445 1726882531.69085: done getting next task for host managed_node1 18445 1726882531.69086: ^ task is: TASK: Gathering Facts 18445 1726882531.69087: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18445 1726882531.69088: getting variables 18445 1726882531.69088: in VariableManager get_vars() 18445 1726882531.69094: Calling all_inventory to load vars for managed_node1 18445 1726882531.69095: Calling groups_inventory to load vars for managed_node1 18445 1726882531.69097: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882531.69099: Calling all_plugins_play to load vars for managed_node1 18445 1726882531.69101: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882531.69102: Calling groups_plugins_play to load vars for managed_node1 18445 1726882531.69203: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882531.69309: done with get_vars() 18445 1726882531.69313: done getting variables 18445 1726882531.69337: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:13 Friday 20 September 2024 21:35:31 -0400 (0:00:00.051) 0:00:03.630 ****** 18445 1726882531.69352: entering _queue_task() for managed_node1/gather_facts 18445 1726882531.69534: worker is 1 (out of 1 available) 18445 1726882531.69544: exiting _queue_task() for managed_node1/gather_facts 18445 1726882531.69557: done queuing things up, now waiting for results queue to drain 18445 1726882531.69559: waiting for pending results... 
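This is the implicit fact-gathering step of the "Test configuring ethernet devices" play. A sketch of how such a play header could be written (the hosts pattern and gather_facts flag are assumptions; only the play name comes from the log):

  - name: Test configuring ethernet devices
    hosts: all            # assumed target pattern
    gather_facts: true    # yields the "Gathering Facts" step queued above
    tasks:
      # Set type/interface, include show_interfaces.yml, etc. (queued in the entries below)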
18445 1726882531.69699: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18445 1726882531.69758: in run() - task 0e448fcc-3ce9-f6eb-935c-0000000000f0 18445 1726882531.69769: variable 'ansible_search_path' from source: unknown 18445 1726882531.69796: calling self._execute() 18445 1726882531.69852: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882531.69859: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882531.69865: variable 'omit' from source: magic vars 18445 1726882531.70152: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882531.71727: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882531.71778: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882531.71803: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882531.71828: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882531.71847: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882531.71910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882531.71929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882531.71946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882531.71980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882531.71991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882531.72088: variable 'ansible_distribution' from source: facts 18445 1726882531.72096: variable 'ansible_distribution_major_version' from source: facts 18445 1726882531.72112: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882531.72118: when evaluation is False, skipping this task 18445 1726882531.72122: _execute() done 18445 1726882531.72124: dumping result to json 18445 1726882531.72127: done dumping result, returning 18445 1726882531.72133: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0e448fcc-3ce9-f6eb-935c-0000000000f0] 18445 1726882531.72138: sending task result for task 0e448fcc-3ce9-f6eb-935c-0000000000f0 18445 1726882531.72207: done sending task result for task 0e448fcc-3ce9-f6eb-935c-0000000000f0 18445 1726882531.72211: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result 
was False" } 18445 1726882531.72256: no more pending results, returning what we have 18445 1726882531.72260: results queue empty 18445 1726882531.72261: checking for any_errors_fatal 18445 1726882531.72262: done checking for any_errors_fatal 18445 1726882531.72262: checking for max_fail_percentage 18445 1726882531.72266: done checking for max_fail_percentage 18445 1726882531.72267: checking to see if all hosts have failed and the running result is not ok 18445 1726882531.72268: done checking to see if all hosts have failed 18445 1726882531.72269: getting the remaining hosts for this loop 18445 1726882531.72270: done getting the remaining hosts for this loop 18445 1726882531.72273: getting the next task for host managed_node1 18445 1726882531.72278: done getting next task for host managed_node1 18445 1726882531.72280: ^ task is: TASK: meta (flush_handlers) 18445 1726882531.72282: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882531.72285: getting variables 18445 1726882531.72287: in VariableManager get_vars() 18445 1726882531.72312: Calling all_inventory to load vars for managed_node1 18445 1726882531.72314: Calling groups_inventory to load vars for managed_node1 18445 1726882531.72317: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882531.72331: Calling all_plugins_play to load vars for managed_node1 18445 1726882531.72334: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882531.72337: Calling groups_plugins_play to load vars for managed_node1 18445 1726882531.72470: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882531.72583: done with get_vars() 18445 1726882531.72589: done getting variables 18445 1726882531.72632: in VariableManager get_vars() 18445 1726882531.72638: Calling all_inventory to load vars for managed_node1 18445 1726882531.72640: Calling groups_inventory to load vars for managed_node1 18445 1726882531.72641: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882531.72644: Calling all_plugins_play to load vars for managed_node1 18445 1726882531.72646: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882531.72649: Calling groups_plugins_play to load vars for managed_node1 18445 1726882531.72748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882531.72859: done with get_vars() 18445 1726882531.72870: done queuing things up, now waiting for results queue to drain 18445 1726882531.72872: results queue empty 18445 1726882531.72873: checking for any_errors_fatal 18445 1726882531.72874: done checking for any_errors_fatal 18445 1726882531.72875: checking for max_fail_percentage 18445 1726882531.72876: done checking for max_fail_percentage 18445 1726882531.72876: checking to see if all hosts have failed and the running result is not ok 18445 1726882531.72876: done checking to see if all hosts have failed 18445 1726882531.72877: getting the remaining hosts for this loop 18445 1726882531.72878: done getting the remaining hosts for this loop 18445 1726882531.72879: getting the next task for host managed_node1 18445 1726882531.72881: done getting next task for host managed_node1 18445 
1726882531.72883: ^ task is: TASK: Set type={{ type }} and interface={{ interface }} 18445 1726882531.72884: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882531.72885: getting variables 18445 1726882531.72885: in VariableManager get_vars() 18445 1726882531.72890: Calling all_inventory to load vars for managed_node1 18445 1726882531.72892: Calling groups_inventory to load vars for managed_node1 18445 1726882531.72893: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882531.72900: Calling all_plugins_play to load vars for managed_node1 18445 1726882531.72901: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882531.72903: Calling groups_plugins_play to load vars for managed_node1 18445 1726882531.72984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882531.73090: done with get_vars() 18445 1726882531.73095: done getting variables 18445 1726882531.73120: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 18445 1726882531.73220: variable 'type' from source: play vars 18445 1726882531.73223: variable 'interface' from source: play vars TASK [Set type=veth and interface=lsr27] *************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:20 Friday 20 September 2024 21:35:31 -0400 (0:00:00.038) 0:00:03.669 ****** 18445 1726882531.73249: entering _queue_task() for managed_node1/set_fact 18445 1726882531.73433: worker is 1 (out of 1 available) 18445 1726882531.73445: exiting _queue_task() for managed_node1/set_fact 18445 1726882531.73461: done queuing things up, now waiting for results queue to drain 18445 1726882531.73462: waiting for pending results... 
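The queued task's name is templated from play vars, and the log renders it as "Set type=veth and interface=lsr27". A sketch of the vars plus the set_fact task (the exact facts being set are an assumption; the log only confirms the set_fact module and the two play vars):

    vars:
      type: veth        # values implied by the rendered task name
      interface: lsr27
    tasks:
      - name: "Set type={{ type }} and interface={{ interface }}"
        set_fact:
          type: "{{ type }}"
          interface: "{{ interface }}"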
18445 1726882531.73605: running TaskExecutor() for managed_node1/TASK: Set type=veth and interface=lsr27 18445 1726882531.73662: in run() - task 0e448fcc-3ce9-f6eb-935c-00000000000f 18445 1726882531.73673: variable 'ansible_search_path' from source: unknown 18445 1726882531.73700: calling self._execute() 18445 1726882531.73750: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882531.73756: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882531.73763: variable 'omit' from source: magic vars 18445 1726882531.74042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882531.75572: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882531.75617: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882531.75644: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882531.75674: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882531.75700: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882531.75756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882531.75778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882531.75798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882531.75826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882531.75837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882531.75934: variable 'ansible_distribution' from source: facts 18445 1726882531.75937: variable 'ansible_distribution_major_version' from source: facts 18445 1726882531.75952: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882531.75955: when evaluation is False, skipping this task 18445 1726882531.75958: _execute() done 18445 1726882531.75961: dumping result to json 18445 1726882531.75971: done dumping result, returning 18445 1726882531.75974: done running TaskExecutor() for managed_node1/TASK: Set type=veth and interface=lsr27 [0e448fcc-3ce9-f6eb-935c-00000000000f] 18445 1726882531.75976: sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000000f 18445 1726882531.76047: done sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000000f 18445 1726882531.76050: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 
9)", "skip_reason": "Conditional result was False" } 18445 1726882531.76126: no more pending results, returning what we have 18445 1726882531.76129: results queue empty 18445 1726882531.76129: checking for any_errors_fatal 18445 1726882531.76131: done checking for any_errors_fatal 18445 1726882531.76132: checking for max_fail_percentage 18445 1726882531.76133: done checking for max_fail_percentage 18445 1726882531.76133: checking to see if all hosts have failed and the running result is not ok 18445 1726882531.76134: done checking to see if all hosts have failed 18445 1726882531.76135: getting the remaining hosts for this loop 18445 1726882531.76136: done getting the remaining hosts for this loop 18445 1726882531.76139: getting the next task for host managed_node1 18445 1726882531.76143: done getting next task for host managed_node1 18445 1726882531.76146: ^ task is: TASK: Include the task 'show_interfaces.yml' 18445 1726882531.76148: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882531.76150: getting variables 18445 1726882531.76152: in VariableManager get_vars() 18445 1726882531.76182: Calling all_inventory to load vars for managed_node1 18445 1726882531.76184: Calling groups_inventory to load vars for managed_node1 18445 1726882531.76187: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882531.76194: Calling all_plugins_play to load vars for managed_node1 18445 1726882531.76195: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882531.76197: Calling groups_plugins_play to load vars for managed_node1 18445 1726882531.76301: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882531.76412: done with get_vars() 18445 1726882531.76418: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:24 Friday 20 September 2024 21:35:31 -0400 (0:00:00.032) 0:00:03.701 ****** 18445 1726882531.76477: entering _queue_task() for managed_node1/include_tasks 18445 1726882531.76636: worker is 1 (out of 1 available) 18445 1726882531.76647: exiting _queue_task() for managed_node1/include_tasks 18445 1726882531.76656: done queuing things up, now waiting for results queue to drain 18445 1726882531.76658: waiting for pending results... 
18445 1726882531.76801: running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' 18445 1726882531.76856: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000010 18445 1726882531.76869: variable 'ansible_search_path' from source: unknown 18445 1726882531.76896: calling self._execute() 18445 1726882531.76944: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882531.76947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882531.76956: variable 'omit' from source: magic vars 18445 1726882531.77300: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882531.78787: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882531.78833: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882531.78867: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882531.78897: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882531.78915: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882531.78974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882531.78997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882531.79014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882531.79039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882531.79052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882531.79138: variable 'ansible_distribution' from source: facts 18445 1726882531.79141: variable 'ansible_distribution_major_version' from source: facts 18445 1726882531.79159: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882531.79162: when evaluation is False, skipping this task 18445 1726882531.79165: _execute() done 18445 1726882531.79176: dumping result to json 18445 1726882531.79179: done dumping result, returning 18445 1726882531.79181: done running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' [0e448fcc-3ce9-f6eb-935c-000000000010] 18445 1726882531.79183: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000010 18445 1726882531.79252: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000010 18445 1726882531.79258: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | 
int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882531.79327: no more pending results, returning what we have 18445 1726882531.79330: results queue empty 18445 1726882531.79330: checking for any_errors_fatal 18445 1726882531.79336: done checking for any_errors_fatal 18445 1726882531.79337: checking for max_fail_percentage 18445 1726882531.79338: done checking for max_fail_percentage 18445 1726882531.79339: checking to see if all hosts have failed and the running result is not ok 18445 1726882531.79340: done checking to see if all hosts have failed 18445 1726882531.79340: getting the remaining hosts for this loop 18445 1726882531.79341: done getting the remaining hosts for this loop 18445 1726882531.79344: getting the next task for host managed_node1 18445 1726882531.79348: done getting next task for host managed_node1 18445 1726882531.79350: ^ task is: TASK: Include the task 'manage_test_interface.yml' 18445 1726882531.79352: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882531.79356: getting variables 18445 1726882531.79358: in VariableManager get_vars() 18445 1726882531.79382: Calling all_inventory to load vars for managed_node1 18445 1726882531.79384: Calling groups_inventory to load vars for managed_node1 18445 1726882531.79386: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882531.79394: Calling all_plugins_play to load vars for managed_node1 18445 1726882531.79396: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882531.79398: Calling groups_plugins_play to load vars for managed_node1 18445 1726882531.79540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882531.79650: done with get_vars() 18445 1726882531.79658: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:26 Friday 20 September 2024 21:35:31 -0400 (0:00:00.032) 0:00:03.734 ****** 18445 1726882531.79718: entering _queue_task() for managed_node1/include_tasks 18445 1726882531.79880: worker is 1 (out of 1 available) 18445 1726882531.79892: exiting _queue_task() for managed_node1/include_tasks 18445 1726882531.79902: done queuing things up, now waiting for results queue to drain 18445 1726882531.79904: waiting for pending results... 
18445 1726882531.80037: running TaskExecutor() for managed_node1/TASK: Include the task 'manage_test_interface.yml' 18445 1726882531.80095: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000011 18445 1726882531.80105: variable 'ansible_search_path' from source: unknown 18445 1726882531.80137: calling self._execute() 18445 1726882531.80193: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882531.80196: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882531.80204: variable 'omit' from source: magic vars 18445 1726882531.80497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882531.82031: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882531.82078: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882531.82105: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882531.82129: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882531.82148: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882531.82218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882531.82238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882531.82255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882531.82284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882531.82296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882531.82389: variable 'ansible_distribution' from source: facts 18445 1726882531.82393: variable 'ansible_distribution_major_version' from source: facts 18445 1726882531.82411: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882531.82414: when evaluation is False, skipping this task 18445 1726882531.82418: _execute() done 18445 1726882531.82421: dumping result to json 18445 1726882531.82423: done dumping result, returning 18445 1726882531.82425: done running TaskExecutor() for managed_node1/TASK: Include the task 'manage_test_interface.yml' [0e448fcc-3ce9-f6eb-935c-000000000011] 18445 1726882531.82429: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000011 18445 1726882531.82501: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000011 18445 1726882531.82504: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882531.82582: no more pending results, returning what we have 18445 1726882531.82584: results queue empty 18445 1726882531.82585: checking for any_errors_fatal 18445 1726882531.82589: done checking for any_errors_fatal 18445 1726882531.82589: checking for max_fail_percentage 18445 1726882531.82591: done checking for max_fail_percentage 18445 1726882531.82592: checking to see if all hosts have failed and the running result is not ok 18445 1726882531.82592: done checking to see if all hosts have failed 18445 1726882531.82593: getting the remaining hosts for this loop 18445 1726882531.82594: done getting the remaining hosts for this loop 18445 1726882531.82597: getting the next task for host managed_node1 18445 1726882531.82601: done getting next task for host managed_node1 18445 1726882531.82603: ^ task is: TASK: Include the task 'assert_device_present.yml' 18445 1726882531.82604: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882531.82606: getting variables 18445 1726882531.82608: in VariableManager get_vars() 18445 1726882531.82630: Calling all_inventory to load vars for managed_node1 18445 1726882531.82632: Calling groups_inventory to load vars for managed_node1 18445 1726882531.82634: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882531.82640: Calling all_plugins_play to load vars for managed_node1 18445 1726882531.82642: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882531.82644: Calling groups_plugins_play to load vars for managed_node1 18445 1726882531.82751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882531.82870: done with get_vars() 18445 1726882531.82878: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:30 Friday 20 September 2024 21:35:31 -0400 (0:00:00.032) 0:00:03.766 ****** 18445 1726882531.82932: entering _queue_task() for managed_node1/include_tasks 18445 1726882531.83090: worker is 1 (out of 1 available) 18445 1726882531.83101: exiting _queue_task() for managed_node1/include_tasks 18445 1726882531.83111: done queuing things up, now waiting for results queue to drain 18445 1726882531.83112: waiting for pending results... 
18445 1726882531.83251: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_present.yml' 18445 1726882531.83307: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000012 18445 1726882531.83317: variable 'ansible_search_path' from source: unknown 18445 1726882531.83344: calling self._execute() 18445 1726882531.83398: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882531.83402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882531.83413: variable 'omit' from source: magic vars 18445 1726882531.83742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882531.85214: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882531.85267: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882531.85292: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882531.85317: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882531.85335: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882531.85394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882531.85414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882531.85431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882531.85462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882531.85475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882531.85561: variable 'ansible_distribution' from source: facts 18445 1726882531.85573: variable 'ansible_distribution_major_version' from source: facts 18445 1726882531.85588: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882531.85591: when evaluation is False, skipping this task 18445 1726882531.85594: _execute() done 18445 1726882531.85596: dumping result to json 18445 1726882531.85598: done dumping result, returning 18445 1726882531.85603: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_present.yml' [0e448fcc-3ce9-f6eb-935c-000000000012] 18445 1726882531.85608: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000012 18445 1726882531.85680: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000012 18445 1726882531.85683: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882531.85752: no more pending results, returning what we have 18445 1726882531.85755: results queue empty 18445 1726882531.85756: checking for any_errors_fatal 18445 1726882531.85761: done checking for any_errors_fatal 18445 1726882531.85762: checking for max_fail_percentage 18445 1726882531.85765: done checking for max_fail_percentage 18445 1726882531.85766: checking to see if all hosts have failed and the running result is not ok 18445 1726882531.85767: done checking to see if all hosts have failed 18445 1726882531.85767: getting the remaining hosts for this loop 18445 1726882531.85768: done getting the remaining hosts for this loop 18445 1726882531.85771: getting the next task for host managed_node1 18445 1726882531.85777: done getting next task for host managed_node1 18445 1726882531.85779: ^ task is: TASK: meta (flush_handlers) 18445 1726882531.85780: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882531.85782: getting variables 18445 1726882531.85783: in VariableManager get_vars() 18445 1726882531.85806: Calling all_inventory to load vars for managed_node1 18445 1726882531.85807: Calling groups_inventory to load vars for managed_node1 18445 1726882531.85810: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882531.85816: Calling all_plugins_play to load vars for managed_node1 18445 1726882531.85817: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882531.85819: Calling groups_plugins_play to load vars for managed_node1 18445 1726882531.85948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882531.86068: done with get_vars() 18445 1726882531.86074: done getting variables 18445 1726882531.86114: in VariableManager get_vars() 18445 1726882531.86121: Calling all_inventory to load vars for managed_node1 18445 1726882531.86123: Calling groups_inventory to load vars for managed_node1 18445 1726882531.86125: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882531.86129: Calling all_plugins_play to load vars for managed_node1 18445 1726882531.86130: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882531.86132: Calling groups_plugins_play to load vars for managed_node1 18445 1726882531.86210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882531.86316: done with get_vars() 18445 1726882531.86324: done queuing things up, now waiting for results queue to drain 18445 1726882531.86325: results queue empty 18445 1726882531.86326: checking for any_errors_fatal 18445 1726882531.86327: done checking for any_errors_fatal 18445 1726882531.86327: checking for max_fail_percentage 18445 1726882531.86328: done checking for max_fail_percentage 18445 1726882531.86328: checking to see if all hosts have failed and the running result is not ok 18445 1726882531.86329: done checking to see if all hosts have failed 18445 1726882531.86329: getting the remaining hosts for this loop 18445 1726882531.86330: done getting the remaining hosts for this loop 18445 1726882531.86331: getting the next task for host managed_node1 18445 
1726882531.86333: done getting next task for host managed_node1 18445 1726882531.86334: ^ task is: TASK: meta (flush_handlers) 18445 1726882531.86335: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882531.86337: getting variables 18445 1726882531.86338: in VariableManager get_vars() 18445 1726882531.86344: Calling all_inventory to load vars for managed_node1 18445 1726882531.86346: Calling groups_inventory to load vars for managed_node1 18445 1726882531.86347: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882531.86356: Calling all_plugins_play to load vars for managed_node1 18445 1726882531.86357: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882531.86359: Calling groups_plugins_play to load vars for managed_node1 18445 1726882531.86449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882531.86552: done with get_vars() 18445 1726882531.86560: done getting variables 18445 1726882531.86590: in VariableManager get_vars() 18445 1726882531.86595: Calling all_inventory to load vars for managed_node1 18445 1726882531.86596: Calling groups_inventory to load vars for managed_node1 18445 1726882531.86598: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882531.86600: Calling all_plugins_play to load vars for managed_node1 18445 1726882531.86602: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882531.86604: Calling groups_plugins_play to load vars for managed_node1 18445 1726882531.86686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882531.86791: done with get_vars() 18445 1726882531.86798: done queuing things up, now waiting for results queue to drain 18445 1726882531.86799: results queue empty 18445 1726882531.86800: checking for any_errors_fatal 18445 1726882531.86801: done checking for any_errors_fatal 18445 1726882531.86801: checking for max_fail_percentage 18445 1726882531.86802: done checking for max_fail_percentage 18445 1726882531.86802: checking to see if all hosts have failed and the running result is not ok 18445 1726882531.86803: done checking to see if all hosts have failed 18445 1726882531.86803: getting the remaining hosts for this loop 18445 1726882531.86804: done getting the remaining hosts for this loop 18445 1726882531.86805: getting the next task for host managed_node1 18445 1726882531.86807: done getting next task for host managed_node1 18445 1726882531.86807: ^ task is: None 18445 1726882531.86808: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18445 1726882531.86809: done queuing things up, now waiting for results queue to drain 18445 1726882531.86809: results queue empty 18445 1726882531.86810: checking for any_errors_fatal 18445 1726882531.86810: done checking for any_errors_fatal 18445 1726882531.86811: checking for max_fail_percentage 18445 1726882531.86811: done checking for max_fail_percentage 18445 1726882531.86812: checking to see if all hosts have failed and the running result is not ok 18445 1726882531.86812: done checking to see if all hosts have failed 18445 1726882531.86813: getting the next task for host managed_node1 18445 1726882531.86814: done getting next task for host managed_node1 18445 1726882531.86814: ^ task is: None 18445 1726882531.86815: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882531.86841: in VariableManager get_vars() 18445 1726882531.86856: done with get_vars() 18445 1726882531.86860: in VariableManager get_vars() 18445 1726882531.86870: done with get_vars() 18445 1726882531.86872: variable 'omit' from source: magic vars 18445 1726882531.86893: in VariableManager get_vars() 18445 1726882531.86901: done with get_vars() 18445 1726882531.86912: variable 'omit' from source: magic vars PLAY [Test static interface up] ************************************************ 18445 1726882531.87287: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18445 1726882531.87304: getting the remaining hosts for this loop 18445 1726882531.87305: done getting the remaining hosts for this loop 18445 1726882531.87306: getting the next task for host managed_node1 18445 1726882531.87308: done getting next task for host managed_node1 18445 1726882531.87309: ^ task is: TASK: Gathering Facts 18445 1726882531.87310: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18445 1726882531.87311: getting variables 18445 1726882531.87312: in VariableManager get_vars() 18445 1726882531.87320: Calling all_inventory to load vars for managed_node1 18445 1726882531.87322: Calling groups_inventory to load vars for managed_node1 18445 1726882531.87324: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882531.87327: Calling all_plugins_play to load vars for managed_node1 18445 1726882531.87328: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882531.87330: Calling groups_plugins_play to load vars for managed_node1 18445 1726882531.87408: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882531.87513: done with get_vars() 18445 1726882531.87518: done getting variables 18445 1726882531.87545: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:33 Friday 20 September 2024 21:35:31 -0400 (0:00:00.046) 0:00:03.812 ****** 18445 1726882531.87562: entering _queue_task() for managed_node1/gather_facts 18445 1726882531.87706: worker is 1 (out of 1 available) 18445 1726882531.87716: exiting _queue_task() for managed_node1/gather_facts 18445 1726882531.87726: done queuing things up, now waiting for results queue to drain 18445 1726882531.87727: waiting for pending results... 
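The previous play has run out of tasks, so the strategy loads the next play, "Test static interface up", and queues its fact-gathering task from tests_ethernet.yml:33. Only the play name and that task are visible in the trace; the header below is an assumed reconstruction of the kind of play that produces this sequence.

    - name: Test static interface up
      hosts: all
      gather_facts: true    # the default; this is what queues the "Gathering Facts" task
      tasks:
        - name: Placeholder for the play body (not captured in this part of the trace)
          ansible.builtin.debug:
            msg: play body omitted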
18445 1726882531.87868: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18445 1726882531.87927: in run() - task 0e448fcc-3ce9-f6eb-935c-00000000010e 18445 1726882531.87938: variable 'ansible_search_path' from source: unknown 18445 1726882531.87967: calling self._execute() 18445 1726882531.88022: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882531.88025: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882531.88033: variable 'omit' from source: magic vars 18445 1726882531.88320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882531.89960: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882531.89999: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882531.90033: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882531.90061: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882531.90081: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882531.90134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882531.90155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882531.90180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882531.90205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882531.90216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882531.90316: variable 'ansible_distribution' from source: facts 18445 1726882531.90319: variable 'ansible_distribution_major_version' from source: facts 18445 1726882531.90333: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882531.90335: when evaluation is False, skipping this task 18445 1726882531.90339: _execute() done 18445 1726882531.90342: dumping result to json 18445 1726882531.90344: done dumping result, returning 18445 1726882531.90348: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0e448fcc-3ce9-f6eb-935c-00000000010e] 18445 1726882531.90359: sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000010e 18445 1726882531.90424: done sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000010e 18445 1726882531.90427: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result 
was False" } 18445 1726882531.90507: no more pending results, returning what we have 18445 1726882531.90510: results queue empty 18445 1726882531.90511: checking for any_errors_fatal 18445 1726882531.90512: done checking for any_errors_fatal 18445 1726882531.90513: checking for max_fail_percentage 18445 1726882531.90514: done checking for max_fail_percentage 18445 1726882531.90515: checking to see if all hosts have failed and the running result is not ok 18445 1726882531.90516: done checking to see if all hosts have failed 18445 1726882531.90517: getting the remaining hosts for this loop 18445 1726882531.90518: done getting the remaining hosts for this loop 18445 1726882531.90521: getting the next task for host managed_node1 18445 1726882531.90525: done getting next task for host managed_node1 18445 1726882531.90527: ^ task is: TASK: meta (flush_handlers) 18445 1726882531.90528: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882531.90531: getting variables 18445 1726882531.90532: in VariableManager get_vars() 18445 1726882531.90562: Calling all_inventory to load vars for managed_node1 18445 1726882531.90565: Calling groups_inventory to load vars for managed_node1 18445 1726882531.90567: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882531.90573: Calling all_plugins_play to load vars for managed_node1 18445 1726882531.90575: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882531.90578: Calling groups_plugins_play to load vars for managed_node1 18445 1726882531.90706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882531.90819: done with get_vars() 18445 1726882531.90825: done getting variables 18445 1726882531.90871: in VariableManager get_vars() 18445 1726882531.90878: Calling all_inventory to load vars for managed_node1 18445 1726882531.90879: Calling groups_inventory to load vars for managed_node1 18445 1726882531.90881: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882531.90883: Calling all_plugins_play to load vars for managed_node1 18445 1726882531.90884: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882531.90886: Calling groups_plugins_play to load vars for managed_node1 18445 1726882531.90969: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882531.91077: done with get_vars() 18445 1726882531.91085: done queuing things up, now waiting for results queue to drain 18445 1726882531.91086: results queue empty 18445 1726882531.91087: checking for any_errors_fatal 18445 1726882531.91088: done checking for any_errors_fatal 18445 1726882531.91089: checking for max_fail_percentage 18445 1726882531.91089: done checking for max_fail_percentage 18445 1726882531.91090: checking to see if all hosts have failed and the running result is not ok 18445 1726882531.91090: done checking to see if all hosts have failed 18445 1726882531.91091: getting the remaining hosts for this loop 18445 1726882531.91091: done getting the remaining hosts for this loop 18445 1726882531.91093: getting the next task for host managed_node1 18445 1726882531.91095: done getting next task for host managed_node1 18445 
1726882531.91097: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 18445 1726882531.91097: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882531.91104: getting variables 18445 1726882531.91104: in VariableManager get_vars() 18445 1726882531.91112: Calling all_inventory to load vars for managed_node1 18445 1726882531.91113: Calling groups_inventory to load vars for managed_node1 18445 1726882531.91114: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882531.91121: Calling all_plugins_play to load vars for managed_node1 18445 1726882531.91123: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882531.91124: Calling groups_plugins_play to load vars for managed_node1 18445 1726882531.91204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882531.91437: done with get_vars() 18445 1726882531.91442: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:35:31 -0400 (0:00:00.039) 0:00:03.851 ****** 18445 1726882531.91494: entering _queue_task() for managed_node1/include_tasks 18445 1726882531.91663: worker is 1 (out of 1 available) 18445 1726882531.91675: exiting _queue_task() for managed_node1/include_tasks 18445 1726882531.91686: done queuing things up, now waiting for results queue to drain 18445 1726882531.91688: waiting for pending results... 
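From here on the task paths point at roles/network/tasks/main.yml, i.e. execution has entered the fedora.linux_system_roles.network role and its first task, 'Ensure ansible_facts used by role', has been queued. How the role is invoked is not shown in this trace; a minimal invocation that would start executing that file is sketched below, and it is entirely an assumption, including the empty network_connections input.

    - name: Enter the network role (hypothetical invocation)
      hosts: all
      tasks:
        - name: Apply fedora.linux_system_roles.network
          ansible.builtin.include_role:
            name: fedora.linux_system_roles.network
          vars:
            network_connections: []   # hypothetical, empty role input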
18445 1726882531.91835: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 18445 1726882531.91902: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000019 18445 1726882531.91917: variable 'ansible_search_path' from source: unknown 18445 1726882531.91921: variable 'ansible_search_path' from source: unknown 18445 1726882531.91947: calling self._execute() 18445 1726882531.92009: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882531.92017: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882531.92025: variable 'omit' from source: magic vars 18445 1726882531.92316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882531.93883: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882531.93924: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882531.93953: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882531.93982: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882531.94001: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882531.94056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882531.94083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882531.94101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882531.94126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882531.94137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882531.94228: variable 'ansible_distribution' from source: facts 18445 1726882531.94232: variable 'ansible_distribution_major_version' from source: facts 18445 1726882531.94246: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882531.94249: when evaluation is False, skipping this task 18445 1726882531.94252: _execute() done 18445 1726882531.94255: dumping result to json 18445 1726882531.94260: done dumping result, returning 18445 1726882531.94267: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-f6eb-935c-000000000019] 18445 1726882531.94273: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000019 18445 1726882531.94346: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000019 18445 1726882531.94349: WORKER PROCESS EXITING skipping: 
[managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882531.94394: no more pending results, returning what we have 18445 1726882531.94397: results queue empty 18445 1726882531.94398: checking for any_errors_fatal 18445 1726882531.94400: done checking for any_errors_fatal 18445 1726882531.94401: checking for max_fail_percentage 18445 1726882531.94402: done checking for max_fail_percentage 18445 1726882531.94403: checking to see if all hosts have failed and the running result is not ok 18445 1726882531.94404: done checking to see if all hosts have failed 18445 1726882531.94405: getting the remaining hosts for this loop 18445 1726882531.94406: done getting the remaining hosts for this loop 18445 1726882531.94410: getting the next task for host managed_node1 18445 1726882531.94414: done getting next task for host managed_node1 18445 1726882531.94418: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 18445 1726882531.94420: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882531.94432: getting variables 18445 1726882531.94433: in VariableManager get_vars() 18445 1726882531.94467: Calling all_inventory to load vars for managed_node1 18445 1726882531.94470: Calling groups_inventory to load vars for managed_node1 18445 1726882531.94472: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882531.94479: Calling all_plugins_play to load vars for managed_node1 18445 1726882531.94482: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882531.94484: Calling groups_plugins_play to load vars for managed_node1 18445 1726882531.94597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882531.94720: done with get_vars() 18445 1726882531.94728: done getting variables 18445 1726882531.94769: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:35:31 -0400 (0:00:00.032) 0:00:03.884 ****** 18445 1726882531.94789: entering _queue_task() for managed_node1/debug 18445 1726882531.94952: worker is 1 (out of 1 available) 18445 1726882531.94968: exiting _queue_task() for managed_node1/debug 18445 1726882531.94979: done queuing things up, now waiting for results queue to drain 18445 1726882531.94980: waiting for pending results... 
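The next role task queued is 'Print network provider' at roles/network/tasks/main.yml:7, for which the 'debug' action plugin is loaded. The message it would print is not captured here because the task is about to be skipped; the sketch below only assumes, from the task name, that it reports the network_provider variable.

    # Sketch of a role-internal debug task (message text assumed)
    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"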
18445 1726882531.95123: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 18445 1726882531.95187: in run() - task 0e448fcc-3ce9-f6eb-935c-00000000001a 18445 1726882531.95198: variable 'ansible_search_path' from source: unknown 18445 1726882531.95202: variable 'ansible_search_path' from source: unknown 18445 1726882531.95228: calling self._execute() 18445 1726882531.95284: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882531.95287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882531.95295: variable 'omit' from source: magic vars 18445 1726882531.95573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882531.97126: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882531.97170: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882531.97196: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882531.97231: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882531.97251: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882531.97308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882531.97330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882531.97352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882531.97385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882531.97396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882531.97488: variable 'ansible_distribution' from source: facts 18445 1726882531.97497: variable 'ansible_distribution_major_version' from source: facts 18445 1726882531.97510: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882531.97513: when evaluation is False, skipping this task 18445 1726882531.97516: _execute() done 18445 1726882531.97518: dumping result to json 18445 1726882531.97520: done dumping result, returning 18445 1726882531.97526: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-f6eb-935c-00000000001a] 18445 1726882531.97531: sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000001a 18445 1726882531.97608: done sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000001a 18445 1726882531.97612: WORKER PROCESS EXITING skipping: [managed_node1] => { 
"false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 18445 1726882531.97657: no more pending results, returning what we have 18445 1726882531.97660: results queue empty 18445 1726882531.97661: checking for any_errors_fatal 18445 1726882531.97671: done checking for any_errors_fatal 18445 1726882531.97672: checking for max_fail_percentage 18445 1726882531.97673: done checking for max_fail_percentage 18445 1726882531.97674: checking to see if all hosts have failed and the running result is not ok 18445 1726882531.97675: done checking to see if all hosts have failed 18445 1726882531.97675: getting the remaining hosts for this loop 18445 1726882531.97677: done getting the remaining hosts for this loop 18445 1726882531.97680: getting the next task for host managed_node1 18445 1726882531.97684: done getting next task for host managed_node1 18445 1726882531.97688: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 18445 1726882531.97690: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882531.97701: getting variables 18445 1726882531.97702: in VariableManager get_vars() 18445 1726882531.97730: Calling all_inventory to load vars for managed_node1 18445 1726882531.97732: Calling groups_inventory to load vars for managed_node1 18445 1726882531.97734: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882531.97741: Calling all_plugins_play to load vars for managed_node1 18445 1726882531.97744: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882531.97746: Calling groups_plugins_play to load vars for managed_node1 18445 1726882531.97849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882531.98002: done with get_vars() 18445 1726882531.98008: done getting variables 18445 1726882531.98067: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:35:31 -0400 (0:00:00.032) 0:00:03.917 ****** 18445 1726882531.98089: entering _queue_task() for managed_node1/fail 18445 1726882531.98091: Creating lock for fail 18445 1726882531.98267: worker is 1 (out of 1 available) 18445 1726882531.98279: exiting _queue_task() for managed_node1/fail 18445 1726882531.98290: done queuing things up, now waiting for results queue to drain 18445 1726882531.98292: waiting for pending results... 
18445 1726882531.98428: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 18445 1726882531.98497: in run() - task 0e448fcc-3ce9-f6eb-935c-00000000001b 18445 1726882531.98507: variable 'ansible_search_path' from source: unknown 18445 1726882531.98510: variable 'ansible_search_path' from source: unknown 18445 1726882531.98542: calling self._execute() 18445 1726882531.98595: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882531.98598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882531.98607: variable 'omit' from source: magic vars 18445 1726882531.98890: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882532.00398: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882532.00439: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882532.00468: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882532.00495: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882532.00514: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882532.00568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882532.00589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882532.00613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882532.00639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882532.00649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882532.00746: variable 'ansible_distribution' from source: facts 18445 1726882532.00750: variable 'ansible_distribution_major_version' from source: facts 18445 1726882532.00769: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882532.00772: when evaluation is False, skipping this task 18445 1726882532.00775: _execute() done 18445 1726882532.00777: dumping result to json 18445 1726882532.00779: done dumping result, returning 18445 1726882532.00785: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-f6eb-935c-00000000001b] 18445 1726882532.00791: sending task result for task 
0e448fcc-3ce9-f6eb-935c-00000000001b 18445 1726882532.00866: done sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000001b 18445 1726882532.00869: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882532.00944: no more pending results, returning what we have 18445 1726882532.00947: results queue empty 18445 1726882532.00947: checking for any_errors_fatal 18445 1726882532.00951: done checking for any_errors_fatal 18445 1726882532.00951: checking for max_fail_percentage 18445 1726882532.00953: done checking for max_fail_percentage 18445 1726882532.00954: checking to see if all hosts have failed and the running result is not ok 18445 1726882532.00955: done checking to see if all hosts have failed 18445 1726882532.00955: getting the remaining hosts for this loop 18445 1726882532.00957: done getting the remaining hosts for this loop 18445 1726882532.00959: getting the next task for host managed_node1 18445 1726882532.00965: done getting next task for host managed_node1 18445 1726882532.00968: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 18445 1726882532.00970: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882532.00985: getting variables 18445 1726882532.00987: in VariableManager get_vars() 18445 1726882532.01012: Calling all_inventory to load vars for managed_node1 18445 1726882532.01014: Calling groups_inventory to load vars for managed_node1 18445 1726882532.01016: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882532.01023: Calling all_plugins_play to load vars for managed_node1 18445 1726882532.01025: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882532.01027: Calling groups_plugins_play to load vars for managed_node1 18445 1726882532.01132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882532.01257: done with get_vars() 18445 1726882532.01266: done getting variables 18445 1726882532.01303: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:35:32 -0400 (0:00:00.032) 0:00:03.950 ****** 18445 1726882532.01323: entering _queue_task() for managed_node1/fail 18445 1726882532.01486: worker is 1 (out of 1 available) 18445 1726882532.01497: exiting _queue_task() for managed_node1/fail 18445 1726882532.01506: done queuing things up, now waiting for results queue to drain 18445 1726882532.01508: waiting for pending results... 
18445 1726882532.01664: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 18445 1726882532.01722: in run() - task 0e448fcc-3ce9-f6eb-935c-00000000001c 18445 1726882532.01731: variable 'ansible_search_path' from source: unknown 18445 1726882532.01738: variable 'ansible_search_path' from source: unknown 18445 1726882532.01770: calling self._execute() 18445 1726882532.01823: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882532.01827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882532.01837: variable 'omit' from source: magic vars 18445 1726882532.02129: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882532.03673: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882532.03716: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882532.03750: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882532.03778: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882532.03798: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882532.03857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882532.03878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882532.03895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882532.03924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882532.03934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882532.04023: variable 'ansible_distribution' from source: facts 18445 1726882532.04030: variable 'ansible_distribution_major_version' from source: facts 18445 1726882532.04043: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882532.04046: when evaluation is False, skipping this task 18445 1726882532.04049: _execute() done 18445 1726882532.04051: dumping result to json 18445 1726882532.04053: done dumping result, returning 18445 1726882532.04069: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-f6eb-935c-00000000001c] 18445 1726882532.04072: sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000001c 18445 1726882532.04146: 
done sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000001c 18445 1726882532.04148: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882532.04199: no more pending results, returning what we have 18445 1726882532.04202: results queue empty 18445 1726882532.04202: checking for any_errors_fatal 18445 1726882532.04207: done checking for any_errors_fatal 18445 1726882532.04207: checking for max_fail_percentage 18445 1726882532.04209: done checking for max_fail_percentage 18445 1726882532.04210: checking to see if all hosts have failed and the running result is not ok 18445 1726882532.04210: done checking to see if all hosts have failed 18445 1726882532.04211: getting the remaining hosts for this loop 18445 1726882532.04212: done getting the remaining hosts for this loop 18445 1726882532.04215: getting the next task for host managed_node1 18445 1726882532.04219: done getting next task for host managed_node1 18445 1726882532.04222: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 18445 1726882532.04224: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882532.04235: getting variables 18445 1726882532.04236: in VariableManager get_vars() 18445 1726882532.04274: Calling all_inventory to load vars for managed_node1 18445 1726882532.04276: Calling groups_inventory to load vars for managed_node1 18445 1726882532.04278: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882532.04286: Calling all_plugins_play to load vars for managed_node1 18445 1726882532.04287: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882532.04289: Calling groups_plugins_play to load vars for managed_node1 18445 1726882532.04423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882532.04539: done with get_vars() 18445 1726882532.04545: done getting variables 18445 1726882532.04586: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:35:32 -0400 (0:00:00.032) 0:00:03.983 ****** 18445 1726882532.04606: entering _queue_task() for managed_node1/fail 18445 1726882532.04769: worker is 1 (out of 1 available) 18445 1726882532.04780: exiting _queue_task() for managed_node1/fail 18445 1726882532.04790: done queuing things up, now waiting for results queue to drain 18445 1726882532.04792: waiting for pending results... 
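Note on the skipped "Abort applying ..." tasks above: each one is dispatched to the fail action plugin and is skipped before the module runs because the reported conditional evaluates to False. The following is a minimal, hypothetical sketch of what such a guard task in a role's tasks/main.yml could look like; only the task name, the module (fail) and the expression printed as "false_condition" are taken from the log. The real role source may list additional when conditions and a different message, neither of which is visible here.

    # Hypothetical sketch reconstructed from the log, not the actual
    # fedora.linux_system_roles.network source. The when expression is the
    # one the log reports as "false_condition"; the msg text is invented.
    - name: Abort applying the network state configuration if the system version of the managed host is below 8
      fail:
        msg: Placeholder message; the real text is not shown in this log
      when: (ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)

When that expression evaluates to False, _execute() returns early, which is why the log prints "when evaluation is False, skipping this task" followed by a skipping result with skip_reason "Conditional result was False".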
18445 1726882532.04928: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 18445 1726882532.04993: in run() - task 0e448fcc-3ce9-f6eb-935c-00000000001d 18445 1726882532.05003: variable 'ansible_search_path' from source: unknown 18445 1726882532.05006: variable 'ansible_search_path' from source: unknown 18445 1726882532.05038: calling self._execute() 18445 1726882532.05092: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882532.05097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882532.05104: variable 'omit' from source: magic vars 18445 1726882532.05390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882532.06886: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882532.06928: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882532.06953: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882532.06979: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882532.07007: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882532.07065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882532.07088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882532.07111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882532.07138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882532.07149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882532.07240: variable 'ansible_distribution' from source: facts 18445 1726882532.07244: variable 'ansible_distribution_major_version' from source: facts 18445 1726882532.07259: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882532.07262: when evaluation is False, skipping this task 18445 1726882532.07266: _execute() done 18445 1726882532.07269: dumping result to json 18445 1726882532.07271: done dumping result, returning 18445 1726882532.07273: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-f6eb-935c-00000000001d] 18445 1726882532.07279: sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000001d 18445 1726882532.07357: done 
sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000001d 18445 1726882532.07360: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882532.07431: no more pending results, returning what we have 18445 1726882532.07434: results queue empty 18445 1726882532.07434: checking for any_errors_fatal 18445 1726882532.07439: done checking for any_errors_fatal 18445 1726882532.07439: checking for max_fail_percentage 18445 1726882532.07441: done checking for max_fail_percentage 18445 1726882532.07442: checking to see if all hosts have failed and the running result is not ok 18445 1726882532.07442: done checking to see if all hosts have failed 18445 1726882532.07443: getting the remaining hosts for this loop 18445 1726882532.07444: done getting the remaining hosts for this loop 18445 1726882532.07447: getting the next task for host managed_node1 18445 1726882532.07450: done getting next task for host managed_node1 18445 1726882532.07453: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 18445 1726882532.07456: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882532.07470: getting variables 18445 1726882532.07471: in VariableManager get_vars() 18445 1726882532.07498: Calling all_inventory to load vars for managed_node1 18445 1726882532.07499: Calling groups_inventory to load vars for managed_node1 18445 1726882532.07501: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882532.07506: Calling all_plugins_play to load vars for managed_node1 18445 1726882532.07508: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882532.07510: Calling groups_plugins_play to load vars for managed_node1 18445 1726882532.07612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882532.07727: done with get_vars() 18445 1726882532.07734: done getting variables 18445 1726882532.07797: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:35:32 -0400 (0:00:00.032) 0:00:04.015 ****** 18445 1726882532.07818: entering _queue_task() for managed_node1/dnf 18445 1726882532.07969: worker is 1 (out of 1 available) 18445 1726882532.07979: exiting _queue_task() for managed_node1/dnf 18445 1726882532.07989: done queuing things up, now waiting for results queue to drain 18445 1726882532.07991: waiting for pending results... 
18445 1726882532.08137: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 18445 1726882532.08197: in run() - task 0e448fcc-3ce9-f6eb-935c-00000000001e 18445 1726882532.08207: variable 'ansible_search_path' from source: unknown 18445 1726882532.08212: variable 'ansible_search_path' from source: unknown 18445 1726882532.08237: calling self._execute() 18445 1726882532.08294: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882532.08297: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882532.08305: variable 'omit' from source: magic vars 18445 1726882532.08589: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882532.10120: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882532.10183: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882532.10232: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882532.10271: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882532.10301: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882532.10379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882532.10413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882532.10442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882532.10489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882532.10509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882532.10634: variable 'ansible_distribution' from source: facts 18445 1726882532.10649: variable 'ansible_distribution_major_version' from source: facts 18445 1726882532.10674: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882532.10681: when evaluation is False, skipping this task 18445 1726882532.10687: _execute() done 18445 1726882532.10692: dumping result to json 18445 1726882532.10699: done dumping result, returning 18445 1726882532.10710: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-f6eb-935c-00000000001e] 18445 1726882532.10719: sending task result for task 
0e448fcc-3ce9-f6eb-935c-00000000001e skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882532.10855: no more pending results, returning what we have 18445 1726882532.10859: results queue empty 18445 1726882532.10860: checking for any_errors_fatal 18445 1726882532.10868: done checking for any_errors_fatal 18445 1726882532.10869: checking for max_fail_percentage 18445 1726882532.10871: done checking for max_fail_percentage 18445 1726882532.10872: checking to see if all hosts have failed and the running result is not ok 18445 1726882532.10872: done checking to see if all hosts have failed 18445 1726882532.10873: getting the remaining hosts for this loop 18445 1726882532.10875: done getting the remaining hosts for this loop 18445 1726882532.10878: getting the next task for host managed_node1 18445 1726882532.10884: done getting next task for host managed_node1 18445 1726882532.10888: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 18445 1726882532.10891: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882532.10902: getting variables 18445 1726882532.10904: in VariableManager get_vars() 18445 1726882532.10940: Calling all_inventory to load vars for managed_node1 18445 1726882532.10942: Calling groups_inventory to load vars for managed_node1 18445 1726882532.10945: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882532.10954: Calling all_plugins_play to load vars for managed_node1 18445 1726882532.10957: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882532.10960: Calling groups_plugins_play to load vars for managed_node1 18445 1726882532.11171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882532.11358: done with get_vars() 18445 1726882532.11369: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 18445 1726882532.11444: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:35:32 -0400 (0:00:00.038) 0:00:04.054 ****** 18445 1726882532.11706: entering _queue_task() for managed_node1/yum 18445 1726882532.11708: Creating lock for yum 18445 1726882532.11736: done sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000001e 18445 1726882532.11739: WORKER PROCESS EXITING 18445 1726882532.11942: worker is 1 (out of 1 available) 18445 1726882532.11954: exiting _queue_task() for managed_node1/yum 18445 
1726882532.11967: done queuing things up, now waiting for results queue to drain 18445 1726882532.11969: waiting for pending results... 18445 1726882532.12211: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 18445 1726882532.12311: in run() - task 0e448fcc-3ce9-f6eb-935c-00000000001f 18445 1726882532.12330: variable 'ansible_search_path' from source: unknown 18445 1726882532.12337: variable 'ansible_search_path' from source: unknown 18445 1726882532.12379: calling self._execute() 18445 1726882532.12457: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882532.12472: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882532.12485: variable 'omit' from source: magic vars 18445 1726882532.12901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882532.14607: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882532.14651: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882532.14681: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882532.14707: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882532.14726: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882532.14791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882532.14814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882532.14833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882532.14869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882532.14879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882532.14976: variable 'ansible_distribution' from source: facts 18445 1726882532.14979: variable 'ansible_distribution_major_version' from source: facts 18445 1726882532.14994: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882532.14998: when evaluation is False, skipping this task 18445 1726882532.15000: _execute() done 18445 1726882532.15002: dumping result to json 18445 1726882532.15005: done dumping result, returning 18445 1726882532.15013: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless 
or team interfaces [0e448fcc-3ce9-f6eb-935c-00000000001f] 18445 1726882532.15016: sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000001f 18445 1726882532.15099: done sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000001f 18445 1726882532.15102: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882532.15162: no more pending results, returning what we have 18445 1726882532.15167: results queue empty 18445 1726882532.15168: checking for any_errors_fatal 18445 1726882532.15174: done checking for any_errors_fatal 18445 1726882532.15175: checking for max_fail_percentage 18445 1726882532.15176: done checking for max_fail_percentage 18445 1726882532.15177: checking to see if all hosts have failed and the running result is not ok 18445 1726882532.15178: done checking to see if all hosts have failed 18445 1726882532.15178: getting the remaining hosts for this loop 18445 1726882532.15180: done getting the remaining hosts for this loop 18445 1726882532.15183: getting the next task for host managed_node1 18445 1726882532.15188: done getting next task for host managed_node1 18445 1726882532.15191: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 18445 1726882532.15193: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882532.15204: getting variables 18445 1726882532.15205: in VariableManager get_vars() 18445 1726882532.15235: Calling all_inventory to load vars for managed_node1 18445 1726882532.15238: Calling groups_inventory to load vars for managed_node1 18445 1726882532.15240: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882532.15248: Calling all_plugins_play to load vars for managed_node1 18445 1726882532.15250: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882532.15256: Calling groups_plugins_play to load vars for managed_node1 18445 1726882532.15362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882532.15482: done with get_vars() 18445 1726882532.15490: done getting variables 18445 1726882532.15527: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:35:32 -0400 (0:00:00.038) 0:00:04.092 ****** 18445 1726882532.15547: entering _queue_task() for managed_node1/fail 18445 1726882532.15722: worker is 1 (out of 1 available) 18445 1726882532.15735: exiting _queue_task() for managed_node1/fail 18445 1726882532.15745: done queuing things up, now waiting for results queue to drain 18445 1726882532.15747: waiting for pending 
results... 18445 1726882532.15904: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 18445 1726882532.15966: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000020 18445 1726882532.15976: variable 'ansible_search_path' from source: unknown 18445 1726882532.15980: variable 'ansible_search_path' from source: unknown 18445 1726882532.16008: calling self._execute() 18445 1726882532.16070: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882532.16074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882532.16083: variable 'omit' from source: magic vars 18445 1726882532.16365: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882532.17901: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882532.17943: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882532.17983: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882532.18006: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882532.18025: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882532.18083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882532.18106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882532.18123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882532.18150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882532.18162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882532.18256: variable 'ansible_distribution' from source: facts 18445 1726882532.18260: variable 'ansible_distribution_major_version' from source: facts 18445 1726882532.18276: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882532.18279: when evaluation is False, skipping this task 18445 1726882532.18281: _execute() done 18445 1726882532.18284: dumping result to json 18445 1726882532.18286: done dumping result, returning 18445 1726882532.18293: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-f6eb-935c-000000000020] 18445 1726882532.18297: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000020 18445 1726882532.18383: done sending task result for 
task 0e448fcc-3ce9-f6eb-935c-000000000020 18445 1726882532.18385: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882532.18433: no more pending results, returning what we have 18445 1726882532.18436: results queue empty 18445 1726882532.18436: checking for any_errors_fatal 18445 1726882532.18443: done checking for any_errors_fatal 18445 1726882532.18443: checking for max_fail_percentage 18445 1726882532.18445: done checking for max_fail_percentage 18445 1726882532.18446: checking to see if all hosts have failed and the running result is not ok 18445 1726882532.18447: done checking to see if all hosts have failed 18445 1726882532.18447: getting the remaining hosts for this loop 18445 1726882532.18449: done getting the remaining hosts for this loop 18445 1726882532.18453: getting the next task for host managed_node1 18445 1726882532.18457: done getting next task for host managed_node1 18445 1726882532.18461: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 18445 1726882532.18465: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882532.18476: getting variables 18445 1726882532.18477: in VariableManager get_vars() 18445 1726882532.18507: Calling all_inventory to load vars for managed_node1 18445 1726882532.18509: Calling groups_inventory to load vars for managed_node1 18445 1726882532.18512: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882532.18523: Calling all_plugins_play to load vars for managed_node1 18445 1726882532.18526: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882532.18529: Calling groups_plugins_play to load vars for managed_node1 18445 1726882532.18683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882532.18803: done with get_vars() 18445 1726882532.18810: done getting variables 18445 1726882532.18851: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:35:32 -0400 (0:00:00.033) 0:00:04.125 ****** 18445 1726882532.18875: entering _queue_task() for managed_node1/package 18445 1726882532.19041: worker is 1 (out of 1 available) 18445 1726882532.19051: exiting _queue_task() for managed_node1/package 18445 1726882532.19066: done queuing things up, now waiting for results queue to drain 18445 1726882532.19067: waiting for pending results... 
18445 1726882532.19223: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 18445 1726882532.19281: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000021 18445 1726882532.19297: variable 'ansible_search_path' from source: unknown 18445 1726882532.19300: variable 'ansible_search_path' from source: unknown 18445 1726882532.19328: calling self._execute() 18445 1726882532.19383: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882532.19387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882532.19396: variable 'omit' from source: magic vars 18445 1726882532.19683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882532.21194: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882532.21234: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882532.21272: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882532.21297: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882532.21316: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882532.21378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882532.21398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882532.21415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882532.21441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882532.21451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882532.21550: variable 'ansible_distribution' from source: facts 18445 1726882532.21555: variable 'ansible_distribution_major_version' from source: facts 18445 1726882532.21576: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882532.21579: when evaluation is False, skipping this task 18445 1726882532.21581: _execute() done 18445 1726882532.21584: dumping result to json 18445 1726882532.21588: done dumping result, returning 18445 1726882532.21594: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-f6eb-935c-000000000021] 18445 1726882532.21601: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000021 18445 1726882532.21679: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000021 18445 1726882532.21682: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, 
"false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882532.21739: no more pending results, returning what we have 18445 1726882532.21742: results queue empty 18445 1726882532.21743: checking for any_errors_fatal 18445 1726882532.21747: done checking for any_errors_fatal 18445 1726882532.21748: checking for max_fail_percentage 18445 1726882532.21749: done checking for max_fail_percentage 18445 1726882532.21750: checking to see if all hosts have failed and the running result is not ok 18445 1726882532.21751: done checking to see if all hosts have failed 18445 1726882532.21752: getting the remaining hosts for this loop 18445 1726882532.21753: done getting the remaining hosts for this loop 18445 1726882532.21758: getting the next task for host managed_node1 18445 1726882532.21762: done getting next task for host managed_node1 18445 1726882532.21767: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 18445 1726882532.21769: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882532.21780: getting variables 18445 1726882532.21781: in VariableManager get_vars() 18445 1726882532.21808: Calling all_inventory to load vars for managed_node1 18445 1726882532.21810: Calling groups_inventory to load vars for managed_node1 18445 1726882532.21812: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882532.21819: Calling all_plugins_play to load vars for managed_node1 18445 1726882532.21821: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882532.21823: Calling groups_plugins_play to load vars for managed_node1 18445 1726882532.21926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882532.22042: done with get_vars() 18445 1726882532.22051: done getting variables 18445 1726882532.22090: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:35:32 -0400 (0:00:00.032) 0:00:04.158 ****** 18445 1726882532.22112: entering _queue_task() for managed_node1/package 18445 1726882532.22273: worker is 1 (out of 1 available) 18445 1726882532.22285: exiting _queue_task() for managed_node1/package 18445 1726882532.22295: done queuing things up, now waiting for results queue to drain 18445 1726882532.22297: waiting for pending results... 
18445 1726882532.22437: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 18445 1726882532.22497: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000022 18445 1726882532.22507: variable 'ansible_search_path' from source: unknown 18445 1726882532.22510: variable 'ansible_search_path' from source: unknown 18445 1726882532.22535: calling self._execute() 18445 1726882532.22592: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882532.22595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882532.22603: variable 'omit' from source: magic vars 18445 1726882532.22873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882532.24374: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882532.24416: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882532.24451: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882532.24482: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882532.24501: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882532.24564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882532.24586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882532.24604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882532.24632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882532.24647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882532.24743: variable 'ansible_distribution' from source: facts 18445 1726882532.24748: variable 'ansible_distribution_major_version' from source: facts 18445 1726882532.24762: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882532.24768: when evaluation is False, skipping this task 18445 1726882532.24773: _execute() done 18445 1726882532.24775: dumping result to json 18445 1726882532.24779: done dumping result, returning 18445 1726882532.24785: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-f6eb-935c-000000000022] 18445 1726882532.24791: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000022 18445 1726882532.24872: done sending task result for task 
0e448fcc-3ce9-f6eb-935c-000000000022 18445 1726882532.24875: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882532.24936: no more pending results, returning what we have 18445 1726882532.24938: results queue empty 18445 1726882532.24939: checking for any_errors_fatal 18445 1726882532.24944: done checking for any_errors_fatal 18445 1726882532.24945: checking for max_fail_percentage 18445 1726882532.24946: done checking for max_fail_percentage 18445 1726882532.24947: checking to see if all hosts have failed and the running result is not ok 18445 1726882532.24948: done checking to see if all hosts have failed 18445 1726882532.24949: getting the remaining hosts for this loop 18445 1726882532.24950: done getting the remaining hosts for this loop 18445 1726882532.24953: getting the next task for host managed_node1 18445 1726882532.24959: done getting next task for host managed_node1 18445 1726882532.24962: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 18445 1726882532.24965: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882532.24975: getting variables 18445 1726882532.24977: in VariableManager get_vars() 18445 1726882532.25004: Calling all_inventory to load vars for managed_node1 18445 1726882532.25006: Calling groups_inventory to load vars for managed_node1 18445 1726882532.25008: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882532.25013: Calling all_plugins_play to load vars for managed_node1 18445 1726882532.25015: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882532.25017: Calling groups_plugins_play to load vars for managed_node1 18445 1726882532.25149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882532.25272: done with get_vars() 18445 1726882532.25283: done getting variables 18445 1726882532.25342: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:35:32 -0400 (0:00:00.032) 0:00:04.190 ****** 18445 1726882532.25383: entering _queue_task() for managed_node1/package 18445 1726882532.25562: worker is 1 (out of 1 available) 18445 1726882532.25577: exiting _queue_task() for managed_node1/package 18445 1726882532.25588: done queuing things up, now waiting for results queue to drain 18445 1726882532.25589: waiting for pending results... 
18445 1726882532.25732: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 18445 1726882532.25789: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000023 18445 1726882532.25799: variable 'ansible_search_path' from source: unknown 18445 1726882532.25802: variable 'ansible_search_path' from source: unknown 18445 1726882532.25829: calling self._execute() 18445 1726882532.25885: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882532.25890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882532.25897: variable 'omit' from source: magic vars 18445 1726882532.26178: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882532.28210: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882532.28295: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882532.28332: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882532.28387: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882532.28418: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882532.28511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882532.28545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882532.28588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882532.28639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882532.28676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882532.28789: variable 'ansible_distribution' from source: facts 18445 1726882532.28812: variable 'ansible_distribution_major_version' from source: facts 18445 1726882532.28834: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882532.28837: when evaluation is False, skipping this task 18445 1726882532.28840: _execute() done 18445 1726882532.28842: dumping result to json 18445 1726882532.28845: done dumping result, returning 18445 1726882532.28850: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-f6eb-935c-000000000023] 18445 1726882532.28858: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000023 18445 1726882532.28937: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000023 18445 
1726882532.28940: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882532.28987: no more pending results, returning what we have 18445 1726882532.28990: results queue empty 18445 1726882532.28991: checking for any_errors_fatal 18445 1726882532.28997: done checking for any_errors_fatal 18445 1726882532.28998: checking for max_fail_percentage 18445 1726882532.28999: done checking for max_fail_percentage 18445 1726882532.29001: checking to see if all hosts have failed and the running result is not ok 18445 1726882532.29001: done checking to see if all hosts have failed 18445 1726882532.29002: getting the remaining hosts for this loop 18445 1726882532.29003: done getting the remaining hosts for this loop 18445 1726882532.29007: getting the next task for host managed_node1 18445 1726882532.29011: done getting next task for host managed_node1 18445 1726882532.29014: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 18445 1726882532.29017: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882532.29028: getting variables 18445 1726882532.29029: in VariableManager get_vars() 18445 1726882532.29062: Calling all_inventory to load vars for managed_node1 18445 1726882532.29073: Calling groups_inventory to load vars for managed_node1 18445 1726882532.29076: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882532.29084: Calling all_plugins_play to load vars for managed_node1 18445 1726882532.29087: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882532.29089: Calling groups_plugins_play to load vars for managed_node1 18445 1726882532.29199: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882532.29317: done with get_vars() 18445 1726882532.29324: done getting variables 18445 1726882532.29390: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:35:32 -0400 (0:00:00.040) 0:00:04.231 ****** 18445 1726882532.29412: entering _queue_task() for managed_node1/service 18445 1726882532.29414: Creating lock for service 18445 1726882532.29587: worker is 1 (out of 1 available) 18445 1726882532.29600: exiting _queue_task() for managed_node1/service 18445 1726882532.29611: done queuing things up, now waiting for results queue to drain 18445 1726882532.29612: waiting for pending results... 
18445 1726882532.29778: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 18445 1726882532.29837: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000024 18445 1726882532.29846: variable 'ansible_search_path' from source: unknown 18445 1726882532.29849: variable 'ansible_search_path' from source: unknown 18445 1726882532.29882: calling self._execute() 18445 1726882532.29931: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882532.29937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882532.29950: variable 'omit' from source: magic vars 18445 1726882532.30245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882532.32510: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882532.32578: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882532.32628: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882532.32677: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882532.32707: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882532.32788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882532.32821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882532.32850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882532.32898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882532.32917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882532.33053: variable 'ansible_distribution' from source: facts 18445 1726882532.33067: variable 'ansible_distribution_major_version' from source: facts 18445 1726882532.33091: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882532.33098: when evaluation is False, skipping this task 18445 1726882532.33104: _execute() done 18445 1726882532.33110: dumping result to json 18445 1726882532.33117: done dumping result, returning 18445 1726882532.33127: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-f6eb-935c-000000000024] 18445 1726882532.33136: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000024 18445 1726882532.33244: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000024 skipping: 
[managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882532.33296: no more pending results, returning what we have 18445 1726882532.33300: results queue empty 18445 1726882532.33300: checking for any_errors_fatal 18445 1726882532.33307: done checking for any_errors_fatal 18445 1726882532.33307: checking for max_fail_percentage 18445 1726882532.33309: done checking for max_fail_percentage 18445 1726882532.33310: checking to see if all hosts have failed and the running result is not ok 18445 1726882532.33311: done checking to see if all hosts have failed 18445 1726882532.33311: getting the remaining hosts for this loop 18445 1726882532.33313: done getting the remaining hosts for this loop 18445 1726882532.33316: getting the next task for host managed_node1 18445 1726882532.33321: done getting next task for host managed_node1 18445 1726882532.33324: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 18445 1726882532.33326: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882532.33338: getting variables 18445 1726882532.33339: in VariableManager get_vars() 18445 1726882532.33377: Calling all_inventory to load vars for managed_node1 18445 1726882532.33379: Calling groups_inventory to load vars for managed_node1 18445 1726882532.33381: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882532.33390: Calling all_plugins_play to load vars for managed_node1 18445 1726882532.33393: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882532.33395: Calling groups_plugins_play to load vars for managed_node1 18445 1726882532.33632: WORKER PROCESS EXITING 18445 1726882532.33653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882532.33874: done with get_vars() 18445 1726882532.33884: done getting variables 18445 1726882532.33946: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:35:32 -0400 (0:00:00.045) 0:00:04.276 ****** 18445 1726882532.33979: entering _queue_task() for managed_node1/service 18445 1726882532.34869: worker is 1 (out of 1 available) 18445 1726882532.34880: exiting _queue_task() for managed_node1/service 18445 1726882532.34889: done queuing things up, now waiting for results queue to drain 18445 1726882532.34891: waiting for pending results... 
18445 1726882532.35144: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 18445 1726882532.35238: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000025 18445 1726882532.35255: variable 'ansible_search_path' from source: unknown 18445 1726882532.35265: variable 'ansible_search_path' from source: unknown 18445 1726882532.35304: calling self._execute() 18445 1726882532.35385: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882532.35399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882532.35414: variable 'omit' from source: magic vars 18445 1726882532.35818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882532.40270: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882532.40338: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882532.40385: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882532.40423: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882532.40451: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882532.40539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882532.40581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882532.40612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882532.40655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882532.40680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882532.40818: variable 'ansible_distribution' from source: facts 18445 1726882532.40829: variable 'ansible_distribution_major_version' from source: facts 18445 1726882532.40852: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882532.40860: when evaluation is False, skipping this task 18445 1726882532.40869: _execute() done 18445 1726882532.40875: dumping result to json 18445 1726882532.40881: done dumping result, returning 18445 1726882532.40893: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-f6eb-935c-000000000025] 18445 1726882532.40904: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000025 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 
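The "censored" field in the result above means the task sets no_log: true, so even the skip output is hidden by the callback. A minimal sketch of how no_log combines with the same conditional on a service task; the module arguments are assumptions for illustration:

- name: Enable and start NetworkManager (illustrative sketch)
  ansible.builtin.service:
    name: NetworkManager         # assumed arguments
    state: started
    enabled: true
  no_log: true                   # hides module output, including skip details, as seen in the log
  when: >-
    ansible_distribution in ['CentOS', 'RedHat'] and
    ansible_distribution_major_version | int < 9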
18445 1726882532.41043: no more pending results, returning what we have 18445 1726882532.41047: results queue empty 18445 1726882532.41047: checking for any_errors_fatal 18445 1726882532.41055: done checking for any_errors_fatal 18445 1726882532.41056: checking for max_fail_percentage 18445 1726882532.41058: done checking for max_fail_percentage 18445 1726882532.41059: checking to see if all hosts have failed and the running result is not ok 18445 1726882532.41060: done checking to see if all hosts have failed 18445 1726882532.41061: getting the remaining hosts for this loop 18445 1726882532.41062: done getting the remaining hosts for this loop 18445 1726882532.41068: getting the next task for host managed_node1 18445 1726882532.41074: done getting next task for host managed_node1 18445 1726882532.41078: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 18445 1726882532.41080: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882532.41092: getting variables 18445 1726882532.41093: in VariableManager get_vars() 18445 1726882532.41130: Calling all_inventory to load vars for managed_node1 18445 1726882532.41133: Calling groups_inventory to load vars for managed_node1 18445 1726882532.41135: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882532.41145: Calling all_plugins_play to load vars for managed_node1 18445 1726882532.41148: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882532.41151: Calling groups_plugins_play to load vars for managed_node1 18445 1726882532.41328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882532.41535: done with get_vars() 18445 1726882532.41545: done getting variables 18445 1726882532.41607: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:35:32 -0400 (0:00:00.076) 0:00:04.353 ****** 18445 1726882532.41644: entering _queue_task() for managed_node1/service 18445 1726882532.41662: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000025 18445 1726882532.41673: WORKER PROCESS EXITING 18445 1726882532.42189: worker is 1 (out of 1 available) 18445 1726882532.42198: exiting _queue_task() for managed_node1/service 18445 1726882532.42209: done queuing things up, now waiting for results queue to drain 18445 1726882532.42210: waiting for pending results... 
18445 1726882532.42453: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 18445 1726882532.42545: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000026 18445 1726882532.42562: variable 'ansible_search_path' from source: unknown 18445 1726882532.42572: variable 'ansible_search_path' from source: unknown 18445 1726882532.42610: calling self._execute() 18445 1726882532.42686: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882532.42697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882532.42709: variable 'omit' from source: magic vars 18445 1726882532.43112: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882532.45932: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882532.46112: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882532.46294: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882532.46338: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882532.46366: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882532.46452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882532.46607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882532.46641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882532.46778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882532.46844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882532.47113: variable 'ansible_distribution' from source: facts 18445 1726882532.47124: variable 'ansible_distribution_major_version' from source: facts 18445 1726882532.47181: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882532.47189: when evaluation is False, skipping this task 18445 1726882532.47196: _execute() done 18445 1726882532.47203: dumping result to json 18445 1726882532.47210: done dumping result, returning 18445 1726882532.47220: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-f6eb-935c-000000000026] 18445 1726882532.47229: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000026 18445 1726882532.47342: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000026 18445 1726882532.47351: WORKER PROCESS EXITING skipping: 
[managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882532.47400: no more pending results, returning what we have 18445 1726882532.47404: results queue empty 18445 1726882532.47405: checking for any_errors_fatal 18445 1726882532.47410: done checking for any_errors_fatal 18445 1726882532.47411: checking for max_fail_percentage 18445 1726882532.47413: done checking for max_fail_percentage 18445 1726882532.47415: checking to see if all hosts have failed and the running result is not ok 18445 1726882532.47415: done checking to see if all hosts have failed 18445 1726882532.47416: getting the remaining hosts for this loop 18445 1726882532.47418: done getting the remaining hosts for this loop 18445 1726882532.47422: getting the next task for host managed_node1 18445 1726882532.47429: done getting next task for host managed_node1 18445 1726882532.47433: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 18445 1726882532.47435: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882532.47449: getting variables 18445 1726882532.47451: in VariableManager get_vars() 18445 1726882532.47489: Calling all_inventory to load vars for managed_node1 18445 1726882532.47492: Calling groups_inventory to load vars for managed_node1 18445 1726882532.47494: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882532.47504: Calling all_plugins_play to load vars for managed_node1 18445 1726882532.47507: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882532.47510: Calling groups_plugins_play to load vars for managed_node1 18445 1726882532.47737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882532.47934: done with get_vars() 18445 1726882532.47943: done getting variables 18445 1726882532.48244: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:35:32 -0400 (0:00:00.066) 0:00:04.419 ****** 18445 1726882532.48276: entering _queue_task() for managed_node1/service 18445 1726882532.48501: worker is 1 (out of 1 available) 18445 1726882532.48514: exiting _queue_task() for managed_node1/service 18445 1726882532.48524: done queuing things up, now waiting for results queue to drain 18445 1726882532.48525: waiting for pending results... 
18445 1726882532.48779: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 18445 1726882532.48873: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000027 18445 1726882532.48891: variable 'ansible_search_path' from source: unknown 18445 1726882532.48899: variable 'ansible_search_path' from source: unknown 18445 1726882532.49570: calling self._execute() 18445 1726882532.49650: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882532.49660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882532.49676: variable 'omit' from source: magic vars 18445 1726882532.50088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882532.54359: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882532.54543: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882532.54735: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882532.54777: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882532.54809: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882532.54974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882532.55011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882532.55041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882532.55094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882532.55114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882532.55266: variable 'ansible_distribution' from source: facts 18445 1726882532.55283: variable 'ansible_distribution_major_version' from source: facts 18445 1726882532.55306: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882532.55314: when evaluation is False, skipping this task 18445 1726882532.55322: _execute() done 18445 1726882532.55328: dumping result to json 18445 1726882532.55335: done dumping result, returning 18445 1726882532.55346: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-f6eb-935c-000000000027] 18445 1726882532.55356: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000027 18445 1726882532.55468: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000027 18445 1726882532.55476: WORKER PROCESS EXITING skipping: [managed_node1] => { 
"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18445 1726882532.55534: no more pending results, returning what we have 18445 1726882532.55538: results queue empty 18445 1726882532.55539: checking for any_errors_fatal 18445 1726882532.55547: done checking for any_errors_fatal 18445 1726882532.55548: checking for max_fail_percentage 18445 1726882532.55550: done checking for max_fail_percentage 18445 1726882532.55551: checking to see if all hosts have failed and the running result is not ok 18445 1726882532.55552: done checking to see if all hosts have failed 18445 1726882532.55553: getting the remaining hosts for this loop 18445 1726882532.55555: done getting the remaining hosts for this loop 18445 1726882532.55559: getting the next task for host managed_node1 18445 1726882532.55567: done getting next task for host managed_node1 18445 1726882532.55571: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 18445 1726882532.55573: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882532.55586: getting variables 18445 1726882532.55588: in VariableManager get_vars() 18445 1726882532.55627: Calling all_inventory to load vars for managed_node1 18445 1726882532.55630: Calling groups_inventory to load vars for managed_node1 18445 1726882532.55633: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882532.55644: Calling all_plugins_play to load vars for managed_node1 18445 1726882532.55647: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882532.55651: Calling groups_plugins_play to load vars for managed_node1 18445 1726882532.55825: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882532.56025: done with get_vars() 18445 1726882532.56036: done getting variables 18445 1726882532.56103: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:35:32 -0400 (0:00:00.078) 0:00:04.498 ****** 18445 1726882532.56140: entering _queue_task() for managed_node1/copy 18445 1726882532.56644: worker is 1 (out of 1 available) 18445 1726882532.57074: exiting _queue_task() for managed_node1/copy 18445 1726882532.57083: done queuing things up, now waiting for results queue to drain 18445 1726882532.57084: waiting for pending results... 
18445 1726882532.58231: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 18445 1726882532.58344: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000028 18445 1726882532.58465: variable 'ansible_search_path' from source: unknown 18445 1726882532.58475: variable 'ansible_search_path' from source: unknown 18445 1726882532.58538: calling self._execute() 18445 1726882532.58689: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882532.58800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882532.58821: variable 'omit' from source: magic vars 18445 1726882532.59273: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882532.62642: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882532.62768: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882532.62810: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882532.62866: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882532.62903: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882532.62999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882532.63036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882532.63070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882532.63122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882532.63143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882532.63296: variable 'ansible_distribution' from source: facts 18445 1726882532.63314: variable 'ansible_distribution_major_version' from source: facts 18445 1726882532.63341: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882532.63349: when evaluation is False, skipping this task 18445 1726882532.63366: _execute() done 18445 1726882532.63375: dumping result to json 18445 1726882532.63382: done dumping result, returning 18445 1726882532.63394: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-f6eb-935c-000000000028] 18445 1726882532.63405: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000028 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882532.63558: no more pending results, returning what we have 18445 1726882532.63562: results queue empty 18445 1726882532.63563: checking for any_errors_fatal 18445 1726882532.63569: done checking for any_errors_fatal 18445 1726882532.63570: checking for max_fail_percentage 18445 1726882532.63571: done checking for max_fail_percentage 18445 1726882532.63572: checking to see if all hosts have failed and the running result is not ok 18445 1726882532.63573: done checking to see if all hosts have failed 18445 1726882532.63574: getting the remaining hosts for this loop 18445 1726882532.63576: done getting the remaining hosts for this loop 18445 1726882532.63580: getting the next task for host managed_node1 18445 1726882532.63585: done getting next task for host managed_node1 18445 1726882532.63589: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 18445 1726882532.63591: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882532.63603: getting variables 18445 1726882532.63605: in VariableManager get_vars() 18445 1726882532.63641: Calling all_inventory to load vars for managed_node1 18445 1726882532.63643: Calling groups_inventory to load vars for managed_node1 18445 1726882532.63646: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882532.63658: Calling all_plugins_play to load vars for managed_node1 18445 1726882532.63661: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882532.63665: Calling groups_plugins_play to load vars for managed_node1 18445 1726882532.63899: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882532.64124: done with get_vars() 18445 1726882532.64134: done getting variables 18445 1726882532.64311: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000028 18445 1726882532.64314: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:35:32 -0400 (0:00:00.082) 0:00:04.580 ****** 18445 1726882532.64357: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 18445 1726882532.64359: Creating lock for fedora.linux_system_roles.network_connections 18445 1726882532.65043: worker is 1 (out of 1 available) 18445 1726882532.65057: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 18445 1726882532.65185: done queuing things up, now waiting for results queue to drain 18445 1726882532.65187: waiting for pending results... 
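fedora.linux_system_roles.network_connections is a collection-provided action plugin (hence the "Creating lock for fedora.linux_system_roles.network_connections" line), presumably driven by a role variable of the same name. A minimal sketch of how a playbook typically feeds that variable to the role; the interface name and addressing are assumptions:

- hosts: managed_node1
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: eth0           # assumed interface name
            type: ethernet
            state: up
            ip:
              dhcp4: true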
18445 1726882532.65360: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 18445 1726882532.65449: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000029 18445 1726882532.65470: variable 'ansible_search_path' from source: unknown 18445 1726882532.65477: variable 'ansible_search_path' from source: unknown 18445 1726882532.65515: calling self._execute() 18445 1726882532.65617: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882532.65631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882532.65644: variable 'omit' from source: magic vars 18445 1726882532.66061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882532.69489: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882532.69554: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882532.69596: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882532.69636: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882532.69682: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882532.69770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882532.69809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882532.69838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882532.69897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882532.69917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882532.70070: variable 'ansible_distribution' from source: facts 18445 1726882532.70087: variable 'ansible_distribution_major_version' from source: facts 18445 1726882532.70117: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882532.70124: when evaluation is False, skipping this task 18445 1726882532.70131: _execute() done 18445 1726882532.70136: dumping result to json 18445 1726882532.70144: done dumping result, returning 18445 1726882532.70200: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-f6eb-935c-000000000029] 18445 1726882532.70220: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000029 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882532.70400: no more pending results, returning what we have 18445 1726882532.70404: results queue empty 18445 1726882532.70405: checking for any_errors_fatal 18445 1726882532.70412: done checking for any_errors_fatal 18445 1726882532.70413: checking for max_fail_percentage 18445 1726882532.70415: done checking for max_fail_percentage 18445 1726882532.70417: checking to see if all hosts have failed and the running result is not ok 18445 1726882532.70417: done checking to see if all hosts have failed 18445 1726882532.70419: getting the remaining hosts for this loop 18445 1726882532.70420: done getting the remaining hosts for this loop 18445 1726882532.70424: getting the next task for host managed_node1 18445 1726882532.70449: done getting next task for host managed_node1 18445 1726882532.70454: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 18445 1726882532.70456: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882532.70474: getting variables 18445 1726882532.70476: in VariableManager get_vars() 18445 1726882532.70516: Calling all_inventory to load vars for managed_node1 18445 1726882532.70519: Calling groups_inventory to load vars for managed_node1 18445 1726882532.70521: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882532.70532: Calling all_plugins_play to load vars for managed_node1 18445 1726882532.70535: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882532.70538: Calling groups_plugins_play to load vars for managed_node1 18445 1726882532.70750: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882532.71828: done with get_vars() 18445 1726882532.71840: done getting variables 18445 1726882532.72917: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000029 18445 1726882532.72921: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:35:32 -0400 (0:00:00.086) 0:00:04.666 ****** 18445 1726882532.72975: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 18445 1726882532.72977: Creating lock for fedora.linux_system_roles.network_state 18445 1726882532.73650: worker is 1 (out of 1 available) 18445 1726882532.73660: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 18445 1726882532.73674: done queuing things up, now waiting for results queue to drain 18445 1726882532.73675: waiting for pending results... 
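The "Configure networking state" task uses a second collection action plugin, fedora.linux_system_roles.network_state, which applies a declarative description of the desired network state. A minimal sketch, assuming nmstate-style keys and a hypothetical interface; the variable name mirrors the plugin name but is otherwise an assumption:

- hosts: managed_node1
  vars:
    network_state:               # assumed nmstate-style desired state
      interfaces:
        - name: eth0             # hypothetical interface
          type: ethernet
          state: up
          ipv4:
            enabled: true
            dhcp: true
  roles:
    - fedora.linux_system_roles.network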
18445 1726882532.74065: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 18445 1726882532.74169: in run() - task 0e448fcc-3ce9-f6eb-935c-00000000002a 18445 1726882532.74189: variable 'ansible_search_path' from source: unknown 18445 1726882532.74196: variable 'ansible_search_path' from source: unknown 18445 1726882532.74233: calling self._execute() 18445 1726882532.74312: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882532.74322: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882532.74332: variable 'omit' from source: magic vars 18445 1726882532.75128: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882532.78184: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882532.78279: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882532.78896: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882532.78933: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882532.78989: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882532.79144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882532.79211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882532.79320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882532.79367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882532.79497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882532.79921: variable 'ansible_distribution' from source: facts 18445 1726882532.79931: variable 'ansible_distribution_major_version' from source: facts 18445 1726882532.79952: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882532.79965: when evaluation is False, skipping this task 18445 1726882532.79973: _execute() done 18445 1726882532.79980: dumping result to json 18445 1726882532.80031: done dumping result, returning 18445 1726882532.80042: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-f6eb-935c-00000000002a] 18445 1726882532.80052: sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000002a 18445 1726882532.80154: done sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000002a 18445 1726882532.80160: WORKER PROCESS EXITING skipping: [managed_node1] => { 
"changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882532.80222: no more pending results, returning what we have 18445 1726882532.80226: results queue empty 18445 1726882532.80226: checking for any_errors_fatal 18445 1726882532.80232: done checking for any_errors_fatal 18445 1726882532.80233: checking for max_fail_percentage 18445 1726882532.80234: done checking for max_fail_percentage 18445 1726882532.80236: checking to see if all hosts have failed and the running result is not ok 18445 1726882532.80236: done checking to see if all hosts have failed 18445 1726882532.80237: getting the remaining hosts for this loop 18445 1726882532.80238: done getting the remaining hosts for this loop 18445 1726882532.80242: getting the next task for host managed_node1 18445 1726882532.80247: done getting next task for host managed_node1 18445 1726882532.80251: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 18445 1726882532.80256: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882532.80269: getting variables 18445 1726882532.80270: in VariableManager get_vars() 18445 1726882532.80305: Calling all_inventory to load vars for managed_node1 18445 1726882532.80307: Calling groups_inventory to load vars for managed_node1 18445 1726882532.80310: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882532.80318: Calling all_plugins_play to load vars for managed_node1 18445 1726882532.80321: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882532.80323: Calling groups_plugins_play to load vars for managed_node1 18445 1726882532.80540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882532.80759: done with get_vars() 18445 1726882532.80771: done getting variables 18445 1726882532.80830: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:35:32 -0400 (0:00:00.078) 0:00:04.745 ****** 18445 1726882532.80860: entering _queue_task() for managed_node1/debug 18445 1726882532.81097: worker is 1 (out of 1 available) 18445 1726882532.81107: exiting _queue_task() for managed_node1/debug 18445 1726882532.81121: done queuing things up, now waiting for results queue to drain 18445 1726882532.81123: waiting for pending results... 
18445 1726882532.81387: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 18445 1726882532.81496: in run() - task 0e448fcc-3ce9-f6eb-935c-00000000002b 18445 1726882532.81515: variable 'ansible_search_path' from source: unknown 18445 1726882532.81522: variable 'ansible_search_path' from source: unknown 18445 1726882532.81570: calling self._execute() 18445 1726882532.81643: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882532.81652: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882532.81675: variable 'omit' from source: magic vars 18445 1726882532.82112: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882532.84914: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882532.84992: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882532.85035: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882532.85081: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882532.85115: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882532.85210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882532.85246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882532.85287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882532.85331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882532.85349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882532.85498: variable 'ansible_distribution' from source: facts 18445 1726882532.85509: variable 'ansible_distribution_major_version' from source: facts 18445 1726882532.85528: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882532.85534: when evaluation is False, skipping this task 18445 1726882532.85540: _execute() done 18445 1726882532.85546: dumping result to json 18445 1726882532.85552: done dumping result, returning 18445 1726882532.85567: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-f6eb-935c-00000000002b] 18445 1726882532.85577: sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000002b 18445 1726882532.85686: done sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000002b 18445 1726882532.85696: WORKER 
PROCESS EXITING skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 18445 1726882532.85746: no more pending results, returning what we have 18445 1726882532.85750: results queue empty 18445 1726882532.85751: checking for any_errors_fatal 18445 1726882532.85758: done checking for any_errors_fatal 18445 1726882532.85759: checking for max_fail_percentage 18445 1726882532.85761: done checking for max_fail_percentage 18445 1726882532.85762: checking to see if all hosts have failed and the running result is not ok 18445 1726882532.85765: done checking to see if all hosts have failed 18445 1726882532.85766: getting the remaining hosts for this loop 18445 1726882532.85767: done getting the remaining hosts for this loop 18445 1726882532.85771: getting the next task for host managed_node1 18445 1726882532.85777: done getting next task for host managed_node1 18445 1726882532.85781: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 18445 1726882532.85783: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882532.85795: getting variables 18445 1726882532.85797: in VariableManager get_vars() 18445 1726882532.85834: Calling all_inventory to load vars for managed_node1 18445 1726882532.85837: Calling groups_inventory to load vars for managed_node1 18445 1726882532.85840: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882532.85850: Calling all_plugins_play to load vars for managed_node1 18445 1726882532.85852: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882532.85858: Calling groups_plugins_play to load vars for managed_node1 18445 1726882532.86038: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882532.86261: done with get_vars() 18445 1726882532.86274: done getting variables 18445 1726882532.86337: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:35:32 -0400 (0:00:00.056) 0:00:04.802 ****** 18445 1726882532.86522: entering _queue_task() for managed_node1/debug 18445 1726882532.86876: worker is 1 (out of 1 available) 18445 1726882532.86886: exiting _queue_task() for managed_node1/debug 18445 1726882532.86897: done queuing things up, now waiting for results queue to drain 18445 1726882532.86898: waiting for pending results... 
18445 1726882532.87162: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 18445 1726882532.87257: in run() - task 0e448fcc-3ce9-f6eb-935c-00000000002c 18445 1726882532.87282: variable 'ansible_search_path' from source: unknown 18445 1726882532.87289: variable 'ansible_search_path' from source: unknown 18445 1726882532.87326: calling self._execute() 18445 1726882532.87408: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882532.87421: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882532.87433: variable 'omit' from source: magic vars 18445 1726882532.87861: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882532.91486: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882532.91559: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882532.91602: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882532.91646: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882532.91681: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882532.91785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882532.91818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882532.91846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882532.91903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882532.91925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882532.92068: variable 'ansible_distribution' from source: facts 18445 1726882532.92085: variable 'ansible_distribution_major_version' from source: facts 18445 1726882532.92105: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882532.92112: when evaluation is False, skipping this task 18445 1726882532.92117: _execute() done 18445 1726882532.92122: dumping result to json 18445 1726882532.92128: done dumping result, returning 18445 1726882532.92139: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-f6eb-935c-00000000002c] 18445 1726882532.92148: sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000002c skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)" } 18445 1726882532.92293: no more pending results, returning what we have 18445 1726882532.92296: results queue empty 18445 1726882532.92297: checking for any_errors_fatal 18445 1726882532.92302: done checking for any_errors_fatal 18445 1726882532.92303: checking for max_fail_percentage 18445 1726882532.92305: done checking for max_fail_percentage 18445 1726882532.92306: checking to see if all hosts have failed and the running result is not ok 18445 1726882532.92307: done checking to see if all hosts have failed 18445 1726882532.92308: getting the remaining hosts for this loop 18445 1726882532.92310: done getting the remaining hosts for this loop 18445 1726882532.92313: getting the next task for host managed_node1 18445 1726882532.92319: done getting next task for host managed_node1 18445 1726882532.92324: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 18445 1726882532.92326: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882532.92338: getting variables 18445 1726882532.92339: in VariableManager get_vars() 18445 1726882532.92382: Calling all_inventory to load vars for managed_node1 18445 1726882532.92385: Calling groups_inventory to load vars for managed_node1 18445 1726882532.92387: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882532.92397: Calling all_plugins_play to load vars for managed_node1 18445 1726882532.92401: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882532.92405: Calling groups_plugins_play to load vars for managed_node1 18445 1726882532.92629: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882532.92843: done with get_vars() 18445 1726882532.92857: done getting variables 18445 1726882532.93001: done sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000002c 18445 1726882532.93004: WORKER PROCESS EXITING 18445 1726882532.93039: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:35:32 -0400 (0:00:00.065) 0:00:04.867 ****** 18445 1726882532.93078: entering _queue_task() for managed_node1/debug 18445 1726882532.93476: worker is 1 (out of 1 available) 18445 1726882532.93487: exiting _queue_task() for managed_node1/debug 18445 1726882532.93498: done queuing things up, now waiting for results queue to drain 18445 1726882532.93499: waiting for pending results... 
18445 1726882532.93756: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 18445 1726882532.93849: in run() - task 0e448fcc-3ce9-f6eb-935c-00000000002d 18445 1726882532.93874: variable 'ansible_search_path' from source: unknown 18445 1726882532.93881: variable 'ansible_search_path' from source: unknown 18445 1726882532.93915: calling self._execute() 18445 1726882532.93999: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882532.94010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882532.94021: variable 'omit' from source: magic vars 18445 1726882532.94449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882532.98109: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882532.98187: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882532.98240: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882532.98283: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882532.98317: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882532.98407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882532.98462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882532.98497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882532.98527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882532.98538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882532.98656: variable 'ansible_distribution' from source: facts 18445 1726882532.98667: variable 'ansible_distribution_major_version' from source: facts 18445 1726882532.98685: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882532.98687: when evaluation is False, skipping this task 18445 1726882532.98690: _execute() done 18445 1726882532.98692: dumping result to json 18445 1726882532.98694: done dumping result, returning 18445 1726882532.98701: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-f6eb-935c-00000000002d] 18445 1726882532.98706: sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000002d 18445 1726882532.98793: done sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000002d 18445 1726882532.98796: WORKER PROCESS EXITING 
skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 18445 1726882532.98871: no more pending results, returning what we have 18445 1726882532.98875: results queue empty 18445 1726882532.98876: checking for any_errors_fatal 18445 1726882532.98881: done checking for any_errors_fatal 18445 1726882532.98882: checking for max_fail_percentage 18445 1726882532.98883: done checking for max_fail_percentage 18445 1726882532.98884: checking to see if all hosts have failed and the running result is not ok 18445 1726882532.98886: done checking to see if all hosts have failed 18445 1726882532.98887: getting the remaining hosts for this loop 18445 1726882532.98888: done getting the remaining hosts for this loop 18445 1726882532.98892: getting the next task for host managed_node1 18445 1726882532.98897: done getting next task for host managed_node1 18445 1726882532.98901: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 18445 1726882532.98903: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882532.98914: getting variables 18445 1726882532.98915: in VariableManager get_vars() 18445 1726882532.98946: Calling all_inventory to load vars for managed_node1 18445 1726882532.98949: Calling groups_inventory to load vars for managed_node1 18445 1726882532.98951: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882532.98961: Calling all_plugins_play to load vars for managed_node1 18445 1726882532.98965: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882532.98968: Calling groups_plugins_play to load vars for managed_node1 18445 1726882532.99079: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882532.99199: done with get_vars() 18445 1726882532.99206: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:35:32 -0400 (0:00:00.061) 0:00:04.929 ****** 18445 1726882532.99273: entering _queue_task() for managed_node1/ping 18445 1726882532.99274: Creating lock for ping 18445 1726882532.99462: worker is 1 (out of 1 available) 18445 1726882532.99477: exiting _queue_task() for managed_node1/ping 18445 1726882532.99487: done queuing things up, now waiting for results queue to drain 18445 1726882532.99488: waiting for pending results... 
18445 1726882532.99647: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 18445 1726882532.99703: in run() - task 0e448fcc-3ce9-f6eb-935c-00000000002e 18445 1726882532.99714: variable 'ansible_search_path' from source: unknown 18445 1726882532.99718: variable 'ansible_search_path' from source: unknown 18445 1726882532.99745: calling self._execute() 18445 1726882532.99811: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882532.99823: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882532.99835: variable 'omit' from source: magic vars 18445 1726882533.00200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882533.03691: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882533.03759: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882533.03804: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882533.03843: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882533.03889: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882533.03974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882533.04009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882533.04041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882533.04088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882533.04109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882533.04240: variable 'ansible_distribution' from source: facts 18445 1726882533.04250: variable 'ansible_distribution_major_version' from source: facts 18445 1726882533.04274: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882533.04282: when evaluation is False, skipping this task 18445 1726882533.04288: _execute() done 18445 1726882533.04295: dumping result to json 18445 1726882533.04303: done dumping result, returning 18445 1726882533.04314: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-f6eb-935c-00000000002e] 18445 1726882533.04324: sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000002e skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": 
"Conditional result was False" } 18445 1726882533.04460: no more pending results, returning what we have 18445 1726882533.04465: results queue empty 18445 1726882533.04466: checking for any_errors_fatal 18445 1726882533.04471: done checking for any_errors_fatal 18445 1726882533.04472: checking for max_fail_percentage 18445 1726882533.04474: done checking for max_fail_percentage 18445 1726882533.04475: checking to see if all hosts have failed and the running result is not ok 18445 1726882533.04475: done checking to see if all hosts have failed 18445 1726882533.04476: getting the remaining hosts for this loop 18445 1726882533.04478: done getting the remaining hosts for this loop 18445 1726882533.04481: getting the next task for host managed_node1 18445 1726882533.04488: done getting next task for host managed_node1 18445 1726882533.04490: ^ task is: TASK: meta (role_complete) 18445 1726882533.04493: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882533.04506: getting variables 18445 1726882533.04508: in VariableManager get_vars() 18445 1726882533.04542: Calling all_inventory to load vars for managed_node1 18445 1726882533.04544: Calling groups_inventory to load vars for managed_node1 18445 1726882533.04547: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882533.04558: Calling all_plugins_play to load vars for managed_node1 18445 1726882533.04560: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882533.04564: Calling groups_plugins_play to load vars for managed_node1 18445 1726882533.04777: done sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000002e 18445 1726882533.04781: WORKER PROCESS EXITING 18445 1726882533.04808: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882533.05040: done with get_vars() 18445 1726882533.05050: done getting variables 18445 1726882533.05137: done queuing things up, now waiting for results queue to drain 18445 1726882533.05139: results queue empty 18445 1726882533.05140: checking for any_errors_fatal 18445 1726882533.05142: done checking for any_errors_fatal 18445 1726882533.05142: checking for max_fail_percentage 18445 1726882533.05143: done checking for max_fail_percentage 18445 1726882533.05144: checking to see if all hosts have failed and the running result is not ok 18445 1726882533.05145: done checking to see if all hosts have failed 18445 1726882533.05145: getting the remaining hosts for this loop 18445 1726882533.05146: done getting the remaining hosts for this loop 18445 1726882533.05149: getting the next task for host managed_node1 18445 1726882533.05151: done getting next task for host managed_node1 18445 1726882533.05156: ^ task is: TASK: Include the task 'assert_output_in_stderr_without_warnings.yml' 18445 1726882533.05158: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18445 1726882533.05160: getting variables 18445 1726882533.05161: in VariableManager get_vars() 18445 1726882533.05174: Calling all_inventory to load vars for managed_node1 18445 1726882533.05176: Calling groups_inventory to load vars for managed_node1 18445 1726882533.05178: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882533.05183: Calling all_plugins_play to load vars for managed_node1 18445 1726882533.05185: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882533.05188: Calling groups_plugins_play to load vars for managed_node1 18445 1726882533.05347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882533.05570: done with get_vars() 18445 1726882533.05578: done getting variables TASK [Include the task 'assert_output_in_stderr_without_warnings.yml'] ********* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:47 Friday 20 September 2024 21:35:33 -0400 (0:00:00.063) 0:00:04.993 ****** 18445 1726882533.05646: entering _queue_task() for managed_node1/include_tasks 18445 1726882533.05925: worker is 1 (out of 1 available) 18445 1726882533.05937: exiting _queue_task() for managed_node1/include_tasks 18445 1726882533.05948: done queuing things up, now waiting for results queue to drain 18445 1726882533.05949: waiting for pending results... 18445 1726882533.06230: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_output_in_stderr_without_warnings.yml' 18445 1726882533.06332: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000030 18445 1726882533.06351: variable 'ansible_search_path' from source: unknown 18445 1726882533.06397: calling self._execute() 18445 1726882533.06482: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882533.06495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882533.06512: variable 'omit' from source: magic vars 18445 1726882533.06975: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882533.10549: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882533.10632: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882533.10691: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882533.10747: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882533.10791: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882533.10907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882533.10959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882533.11001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882533.11069: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882533.11093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882533.11798: variable 'ansible_distribution' from source: facts 18445 1726882533.11801: variable 'ansible_distribution_major_version' from source: facts 18445 1726882533.11804: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882533.11806: when evaluation is False, skipping this task 18445 1726882533.11808: _execute() done 18445 1726882533.11810: dumping result to json 18445 1726882533.11812: done dumping result, returning 18445 1726882533.11815: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_output_in_stderr_without_warnings.yml' [0e448fcc-3ce9-f6eb-935c-000000000030] 18445 1726882533.11817: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000030 18445 1726882533.11893: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000030 18445 1726882533.11897: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882533.11938: no more pending results, returning what we have 18445 1726882533.11941: results queue empty 18445 1726882533.11942: checking for any_errors_fatal 18445 1726882533.11944: done checking for any_errors_fatal 18445 1726882533.11952: checking for max_fail_percentage 18445 1726882533.11953: done checking for max_fail_percentage 18445 1726882533.11954: checking to see if all hosts have failed and the running result is not ok 18445 1726882533.11957: done checking to see if all hosts have failed 18445 1726882533.11958: getting the remaining hosts for this loop 18445 1726882533.11960: done getting the remaining hosts for this loop 18445 1726882533.11965: getting the next task for host managed_node1 18445 1726882533.11972: done getting next task for host managed_node1 18445 1726882533.11975: ^ task is: TASK: meta (flush_handlers) 18445 1726882533.11976: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18445 1726882533.11980: getting variables 18445 1726882533.11981: in VariableManager get_vars() 18445 1726882533.12020: Calling all_inventory to load vars for managed_node1 18445 1726882533.12023: Calling groups_inventory to load vars for managed_node1 18445 1726882533.12025: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882533.12034: Calling all_plugins_play to load vars for managed_node1 18445 1726882533.12037: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882533.12039: Calling groups_plugins_play to load vars for managed_node1 18445 1726882533.12180: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882533.12589: done with get_vars() 18445 1726882533.12599: done getting variables 18445 1726882533.12669: in VariableManager get_vars() 18445 1726882533.12681: Calling all_inventory to load vars for managed_node1 18445 1726882533.12683: Calling groups_inventory to load vars for managed_node1 18445 1726882533.12685: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882533.12689: Calling all_plugins_play to load vars for managed_node1 18445 1726882533.12691: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882533.12694: Calling groups_plugins_play to load vars for managed_node1 18445 1726882533.12831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882533.13060: done with get_vars() 18445 1726882533.13075: done queuing things up, now waiting for results queue to drain 18445 1726882533.13077: results queue empty 18445 1726882533.13077: checking for any_errors_fatal 18445 1726882533.13079: done checking for any_errors_fatal 18445 1726882533.13080: checking for max_fail_percentage 18445 1726882533.13081: done checking for max_fail_percentage 18445 1726882533.13082: checking to see if all hosts have failed and the running result is not ok 18445 1726882533.13083: done checking to see if all hosts have failed 18445 1726882533.13083: getting the remaining hosts for this loop 18445 1726882533.13084: done getting the remaining hosts for this loop 18445 1726882533.13086: getting the next task for host managed_node1 18445 1726882533.13090: done getting next task for host managed_node1 18445 1726882533.13091: ^ task is: TASK: meta (flush_handlers) 18445 1726882533.13092: ^ state is: HOST STATE: block=6, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18445 1726882533.13095: getting variables 18445 1726882533.13096: in VariableManager get_vars() 18445 1726882533.13105: Calling all_inventory to load vars for managed_node1 18445 1726882533.13107: Calling groups_inventory to load vars for managed_node1 18445 1726882533.13113: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882533.13118: Calling all_plugins_play to load vars for managed_node1 18445 1726882533.13121: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882533.13123: Calling groups_plugins_play to load vars for managed_node1 18445 1726882533.13627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882533.13879: done with get_vars() 18445 1726882533.13887: done getting variables 18445 1726882533.13929: in VariableManager get_vars() 18445 1726882533.13940: Calling all_inventory to load vars for managed_node1 18445 1726882533.13942: Calling groups_inventory to load vars for managed_node1 18445 1726882533.13944: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882533.13948: Calling all_plugins_play to load vars for managed_node1 18445 1726882533.13951: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882533.13953: Calling groups_plugins_play to load vars for managed_node1 18445 1726882533.14121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882533.14320: done with get_vars() 18445 1726882533.14331: done queuing things up, now waiting for results queue to drain 18445 1726882533.14332: results queue empty 18445 1726882533.14333: checking for any_errors_fatal 18445 1726882533.14334: done checking for any_errors_fatal 18445 1726882533.14335: checking for max_fail_percentage 18445 1726882533.14336: done checking for max_fail_percentage 18445 1726882533.14337: checking to see if all hosts have failed and the running result is not ok 18445 1726882533.14337: done checking to see if all hosts have failed 18445 1726882533.14338: getting the remaining hosts for this loop 18445 1726882533.14339: done getting the remaining hosts for this loop 18445 1726882533.14341: getting the next task for host managed_node1 18445 1726882533.14344: done getting next task for host managed_node1 18445 1726882533.14345: ^ task is: None 18445 1726882533.14346: ^ state is: HOST STATE: block=7, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18445 1726882533.14347: done queuing things up, now waiting for results queue to drain 18445 1726882533.14348: results queue empty 18445 1726882533.14349: checking for any_errors_fatal 18445 1726882533.14349: done checking for any_errors_fatal 18445 1726882533.14350: checking for max_fail_percentage 18445 1726882533.14351: done checking for max_fail_percentage 18445 1726882533.14351: checking to see if all hosts have failed and the running result is not ok 18445 1726882533.14352: done checking to see if all hosts have failed 18445 1726882533.14353: getting the next task for host managed_node1 18445 1726882533.14355: done getting next task for host managed_node1 18445 1726882533.14356: ^ task is: None 18445 1726882533.14357: ^ state is: HOST STATE: block=7, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882533.14387: in VariableManager get_vars() 18445 1726882533.14407: done with get_vars() 18445 1726882533.14412: in VariableManager get_vars() 18445 1726882533.14420: done with get_vars() 18445 1726882533.14423: variable 'omit' from source: magic vars 18445 1726882533.14449: in VariableManager get_vars() 18445 1726882533.14457: done with get_vars() 18445 1726882533.14476: variable 'omit' from source: magic vars PLAY [Play for cleaning up the test device and the connection profile] ********* 18445 1726882533.14643: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18445 1726882533.14884: getting the remaining hosts for this loop 18445 1726882533.14886: done getting the remaining hosts for this loop 18445 1726882533.14888: getting the next task for host managed_node1 18445 1726882533.14891: done getting next task for host managed_node1 18445 1726882533.14893: ^ task is: TASK: Gathering Facts 18445 1726882533.14894: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18445 1726882533.14896: getting variables 18445 1726882533.14897: in VariableManager get_vars() 18445 1726882533.14905: Calling all_inventory to load vars for managed_node1 18445 1726882533.14907: Calling groups_inventory to load vars for managed_node1 18445 1726882533.14909: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882533.14914: Calling all_plugins_play to load vars for managed_node1 18445 1726882533.14916: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882533.14919: Calling groups_plugins_play to load vars for managed_node1 18445 1726882533.15057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882533.15230: done with get_vars() 18445 1726882533.15238: done getting variables 18445 1726882533.15278: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:50 Friday 20 September 2024 21:35:33 -0400 (0:00:00.096) 0:00:05.090 ****** 18445 1726882533.15308: entering _queue_task() for managed_node1/gather_facts 18445 1726882533.15771: worker is 1 (out of 1 available) 18445 1726882533.15782: exiting _queue_task() for managed_node1/gather_facts 18445 1726882533.15798: done queuing things up, now waiting for results queue to drain 18445 1726882533.15799: waiting for pending results... 
18445 1726882533.16072: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18445 1726882533.16171: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000197 18445 1726882533.16192: variable 'ansible_search_path' from source: unknown 18445 1726882533.16242: calling self._execute() 18445 1726882533.16317: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882533.16328: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882533.16351: variable 'omit' from source: magic vars 18445 1726882533.16816: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882533.18827: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882533.18874: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882533.18902: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882533.18927: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882533.18945: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882533.19004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882533.19025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882533.19044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882533.19078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882533.19091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882533.19184: variable 'ansible_distribution' from source: facts 18445 1726882533.19190: variable 'ansible_distribution_major_version' from source: facts 18445 1726882533.19205: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882533.19208: when evaluation is False, skipping this task 18445 1726882533.19211: _execute() done 18445 1726882533.19214: dumping result to json 18445 1726882533.19216: done dumping result, returning 18445 1726882533.19220: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0e448fcc-3ce9-f6eb-935c-000000000197] 18445 1726882533.19226: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000197 18445 1726882533.19293: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000197 18445 1726882533.19296: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result 
was False" } 18445 1726882533.19342: no more pending results, returning what we have 18445 1726882533.19346: results queue empty 18445 1726882533.19346: checking for any_errors_fatal 18445 1726882533.19347: done checking for any_errors_fatal 18445 1726882533.19348: checking for max_fail_percentage 18445 1726882533.19349: done checking for max_fail_percentage 18445 1726882533.19350: checking to see if all hosts have failed and the running result is not ok 18445 1726882533.19351: done checking to see if all hosts have failed 18445 1726882533.19352: getting the remaining hosts for this loop 18445 1726882533.19356: done getting the remaining hosts for this loop 18445 1726882533.19359: getting the next task for host managed_node1 18445 1726882533.19365: done getting next task for host managed_node1 18445 1726882533.19367: ^ task is: TASK: meta (flush_handlers) 18445 1726882533.19369: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882533.19373: getting variables 18445 1726882533.19374: in VariableManager get_vars() 18445 1726882533.19399: Calling all_inventory to load vars for managed_node1 18445 1726882533.19402: Calling groups_inventory to load vars for managed_node1 18445 1726882533.19405: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882533.19415: Calling all_plugins_play to load vars for managed_node1 18445 1726882533.19418: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882533.19421: Calling groups_plugins_play to load vars for managed_node1 18445 1726882533.19557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882533.19681: done with get_vars() 18445 1726882533.19687: done getting variables 18445 1726882533.19730: in VariableManager get_vars() 18445 1726882533.19736: Calling all_inventory to load vars for managed_node1 18445 1726882533.19738: Calling groups_inventory to load vars for managed_node1 18445 1726882533.19740: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882533.19743: Calling all_plugins_play to load vars for managed_node1 18445 1726882533.19745: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882533.19748: Calling groups_plugins_play to load vars for managed_node1 18445 1726882533.19850: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882533.19967: done with get_vars() 18445 1726882533.19976: done queuing things up, now waiting for results queue to drain 18445 1726882533.19978: results queue empty 18445 1726882533.19979: checking for any_errors_fatal 18445 1726882533.19980: done checking for any_errors_fatal 18445 1726882533.19981: checking for max_fail_percentage 18445 1726882533.19981: done checking for max_fail_percentage 18445 1726882533.19982: checking to see if all hosts have failed and the running result is not ok 18445 1726882533.19982: done checking to see if all hosts have failed 18445 1726882533.19983: getting the remaining hosts for this loop 18445 1726882533.19983: done getting the remaining hosts for this loop 18445 1726882533.19985: getting the next task for host managed_node1 18445 1726882533.19987: done getting next task for host managed_node1 18445 
1726882533.19989: ^ task is: TASK: Show network_provider 18445 1726882533.19989: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882533.19991: getting variables 18445 1726882533.19991: in VariableManager get_vars() 18445 1726882533.19996: Calling all_inventory to load vars for managed_node1 18445 1726882533.19997: Calling groups_inventory to load vars for managed_node1 18445 1726882533.19999: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882533.20002: Calling all_plugins_play to load vars for managed_node1 18445 1726882533.20007: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882533.20008: Calling groups_plugins_play to load vars for managed_node1 18445 1726882533.20088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882533.20200: done with get_vars() 18445 1726882533.20206: done getting variables 18445 1726882533.20231: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show network_provider] *************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:53 Friday 20 September 2024 21:35:33 -0400 (0:00:00.049) 0:00:05.139 ****** 18445 1726882533.20248: entering _queue_task() for managed_node1/debug 18445 1726882533.20419: worker is 1 (out of 1 available) 18445 1726882533.20431: exiting _queue_task() for managed_node1/debug 18445 1726882533.20443: done queuing things up, now waiting for results queue to drain 18445 1726882533.20445: waiting for pending results... 
18445 1726882533.20597: running TaskExecutor() for managed_node1/TASK: Show network_provider 18445 1726882533.20653: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000033 18445 1726882533.20665: variable 'ansible_search_path' from source: unknown 18445 1726882533.20693: calling self._execute() 18445 1726882533.20742: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882533.20752: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882533.20762: variable 'omit' from source: magic vars 18445 1726882533.21067: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882533.23181: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882533.23249: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882533.23289: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882533.23324: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882533.23349: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882533.23427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882533.23459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882533.23491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882533.23531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882533.23542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882533.23639: variable 'ansible_distribution' from source: facts 18445 1726882533.23644: variable 'ansible_distribution_major_version' from source: facts 18445 1726882533.23662: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882533.23666: when evaluation is False, skipping this task 18445 1726882533.23669: _execute() done 18445 1726882533.23671: dumping result to json 18445 1726882533.23673: done dumping result, returning 18445 1726882533.23679: done running TaskExecutor() for managed_node1/TASK: Show network_provider [0e448fcc-3ce9-f6eb-935c-000000000033] 18445 1726882533.23684: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000033 18445 1726882533.23756: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000033 18445 1726882533.23759: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 18445 1726882533.23799: no more pending 
results, returning what we have 18445 1726882533.23802: results queue empty 18445 1726882533.23803: checking for any_errors_fatal 18445 1726882533.23805: done checking for any_errors_fatal 18445 1726882533.23806: checking for max_fail_percentage 18445 1726882533.23807: done checking for max_fail_percentage 18445 1726882533.23808: checking to see if all hosts have failed and the running result is not ok 18445 1726882533.23809: done checking to see if all hosts have failed 18445 1726882533.23810: getting the remaining hosts for this loop 18445 1726882533.23811: done getting the remaining hosts for this loop 18445 1726882533.23814: getting the next task for host managed_node1 18445 1726882533.23820: done getting next task for host managed_node1 18445 1726882533.23822: ^ task is: TASK: meta (flush_handlers) 18445 1726882533.23824: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882533.23828: getting variables 18445 1726882533.23829: in VariableManager get_vars() 18445 1726882533.23851: Calling all_inventory to load vars for managed_node1 18445 1726882533.23854: Calling groups_inventory to load vars for managed_node1 18445 1726882533.23856: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882533.23872: Calling all_plugins_play to load vars for managed_node1 18445 1726882533.23875: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882533.23878: Calling groups_plugins_play to load vars for managed_node1 18445 1726882533.23992: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882533.24103: done with get_vars() 18445 1726882533.24110: done getting variables 18445 1726882533.24151: in VariableManager get_vars() 18445 1726882533.24157: Calling all_inventory to load vars for managed_node1 18445 1726882533.24159: Calling groups_inventory to load vars for managed_node1 18445 1726882533.24160: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882533.24165: Calling all_plugins_play to load vars for managed_node1 18445 1726882533.24166: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882533.24168: Calling groups_plugins_play to load vars for managed_node1 18445 1726882533.24276: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882533.24379: done with get_vars() 18445 1726882533.24387: done queuing things up, now waiting for results queue to drain 18445 1726882533.24388: results queue empty 18445 1726882533.24388: checking for any_errors_fatal 18445 1726882533.24390: done checking for any_errors_fatal 18445 1726882533.24390: checking for max_fail_percentage 18445 1726882533.24391: done checking for max_fail_percentage 18445 1726882533.24391: checking to see if all hosts have failed and the running result is not ok 18445 1726882533.24392: done checking to see if all hosts have failed 18445 1726882533.24392: getting the remaining hosts for this loop 18445 1726882533.24393: done getting the remaining hosts for this loop 18445 1726882533.24394: getting the next task for host managed_node1 18445 1726882533.24396: done getting next task for host managed_node1 18445 1726882533.24397: ^ task is: TASK: meta (flush_handlers) 18445 
1726882533.24398: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882533.24400: getting variables 18445 1726882533.24400: in VariableManager get_vars() 18445 1726882533.24405: Calling all_inventory to load vars for managed_node1 18445 1726882533.24406: Calling groups_inventory to load vars for managed_node1 18445 1726882533.24407: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882533.24411: Calling all_plugins_play to load vars for managed_node1 18445 1726882533.24417: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882533.24419: Calling groups_plugins_play to load vars for managed_node1 18445 1726882533.24494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882533.24598: done with get_vars() 18445 1726882533.24603: done getting variables 18445 1726882533.24632: in VariableManager get_vars() 18445 1726882533.24638: Calling all_inventory to load vars for managed_node1 18445 1726882533.24639: Calling groups_inventory to load vars for managed_node1 18445 1726882533.24641: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882533.24643: Calling all_plugins_play to load vars for managed_node1 18445 1726882533.24645: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882533.24646: Calling groups_plugins_play to load vars for managed_node1 18445 1726882533.24723: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882533.24828: done with get_vars() 18445 1726882533.24835: done queuing things up, now waiting for results queue to drain 18445 1726882533.24836: results queue empty 18445 1726882533.24836: checking for any_errors_fatal 18445 1726882533.24837: done checking for any_errors_fatal 18445 1726882533.24838: checking for max_fail_percentage 18445 1726882533.24838: done checking for max_fail_percentage 18445 1726882533.24839: checking to see if all hosts have failed and the running result is not ok 18445 1726882533.24839: done checking to see if all hosts have failed 18445 1726882533.24840: getting the remaining hosts for this loop 18445 1726882533.24840: done getting the remaining hosts for this loop 18445 1726882533.24842: getting the next task for host managed_node1 18445 1726882533.24844: done getting next task for host managed_node1 18445 1726882533.24845: ^ task is: None 18445 1726882533.24846: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18445 1726882533.24847: done queuing things up, now waiting for results queue to drain 18445 1726882533.24848: results queue empty 18445 1726882533.24848: checking for any_errors_fatal 18445 1726882533.24849: done checking for any_errors_fatal 18445 1726882533.24849: checking for max_fail_percentage 18445 1726882533.24850: done checking for max_fail_percentage 18445 1726882533.24850: checking to see if all hosts have failed and the running result is not ok 18445 1726882533.24851: done checking to see if all hosts have failed 18445 1726882533.24851: getting the next task for host managed_node1 18445 1726882533.24853: done getting next task for host managed_node1 18445 1726882533.24853: ^ task is: None 18445 1726882533.24854: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882533.24879: in VariableManager get_vars() 18445 1726882533.24893: done with get_vars() 18445 1726882533.24897: in VariableManager get_vars() 18445 1726882533.24904: done with get_vars() 18445 1726882533.24907: variable 'omit' from source: magic vars 18445 1726882533.24989: variable 'profile' from source: play vars 18445 1726882533.25091: in VariableManager get_vars() 18445 1726882533.25100: done with get_vars() 18445 1726882533.25113: variable 'omit' from source: magic vars 18445 1726882533.25155: variable 'profile' from source: play vars PLAY [Set down {{ profile }}] ************************************************** 18445 1726882533.25540: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18445 1726882533.25559: getting the remaining hosts for this loop 18445 1726882533.25560: done getting the remaining hosts for this loop 18445 1726882533.25562: getting the next task for host managed_node1 18445 1726882533.25566: done getting next task for host managed_node1 18445 1726882533.25567: ^ task is: TASK: Gathering Facts 18445 1726882533.25568: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18445 1726882533.25569: getting variables 18445 1726882533.25570: in VariableManager get_vars() 18445 1726882533.25577: Calling all_inventory to load vars for managed_node1 18445 1726882533.25578: Calling groups_inventory to load vars for managed_node1 18445 1726882533.25579: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882533.25583: Calling all_plugins_play to load vars for managed_node1 18445 1726882533.25584: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882533.25586: Calling groups_plugins_play to load vars for managed_node1 18445 1726882533.25665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882533.25768: done with get_vars() 18445 1726882533.25773: done getting variables 18445 1726882533.25797: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 Friday 20 September 2024 21:35:33 -0400 (0:00:00.055) 0:00:05.195 ****** 18445 1726882533.25812: entering _queue_task() for managed_node1/gather_facts 18445 1726882533.26107: worker is 1 (out of 1 available) 18445 1726882533.26122: exiting _queue_task() for managed_node1/gather_facts 18445 1726882533.26133: done queuing things up, now waiting for results queue to drain 18445 1726882533.26135: waiting for pending results... 
18445 1726882533.26364: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18445 1726882533.26422: in run() - task 0e448fcc-3ce9-f6eb-935c-0000000001ac 18445 1726882533.26433: variable 'ansible_search_path' from source: unknown 18445 1726882533.26466: calling self._execute() 18445 1726882533.26571: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882533.26575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882533.26584: variable 'omit' from source: magic vars 18445 1726882533.26876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882533.28877: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882533.28951: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882533.28994: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882533.29031: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882533.29062: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882533.29142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882533.29182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882533.29212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882533.29259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882533.29286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882533.29419: variable 'ansible_distribution' from source: facts 18445 1726882533.29430: variable 'ansible_distribution_major_version' from source: facts 18445 1726882533.29450: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882533.29460: when evaluation is False, skipping this task 18445 1726882533.29469: _execute() done 18445 1726882533.29475: dumping result to json 18445 1726882533.29482: done dumping result, returning 18445 1726882533.29491: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0e448fcc-3ce9-f6eb-935c-0000000001ac] 18445 1726882533.29500: sending task result for task 0e448fcc-3ce9-f6eb-935c-0000000001ac 18445 1726882533.29588: done sending task result for task 0e448fcc-3ce9-f6eb-935c-0000000001ac 18445 1726882533.29594: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result 
was False" } 18445 1726882533.29696: no more pending results, returning what we have 18445 1726882533.29701: results queue empty 18445 1726882533.29702: checking for any_errors_fatal 18445 1726882533.29703: done checking for any_errors_fatal 18445 1726882533.29704: checking for max_fail_percentage 18445 1726882533.29705: done checking for max_fail_percentage 18445 1726882533.29706: checking to see if all hosts have failed and the running result is not ok 18445 1726882533.29707: done checking to see if all hosts have failed 18445 1726882533.29708: getting the remaining hosts for this loop 18445 1726882533.29709: done getting the remaining hosts for this loop 18445 1726882533.29712: getting the next task for host managed_node1 18445 1726882533.29717: done getting next task for host managed_node1 18445 1726882533.29719: ^ task is: TASK: meta (flush_handlers) 18445 1726882533.29721: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882533.29723: getting variables 18445 1726882533.29725: in VariableManager get_vars() 18445 1726882533.29754: Calling all_inventory to load vars for managed_node1 18445 1726882533.29756: Calling groups_inventory to load vars for managed_node1 18445 1726882533.29758: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882533.29769: Calling all_plugins_play to load vars for managed_node1 18445 1726882533.29772: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882533.29775: Calling groups_plugins_play to load vars for managed_node1 18445 1726882533.29943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882533.30154: done with get_vars() 18445 1726882533.30166: done getting variables 18445 1726882533.30235: in VariableManager get_vars() 18445 1726882533.30246: Calling all_inventory to load vars for managed_node1 18445 1726882533.30248: Calling groups_inventory to load vars for managed_node1 18445 1726882533.30250: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882533.30254: Calling all_plugins_play to load vars for managed_node1 18445 1726882533.30257: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882533.30260: Calling groups_plugins_play to load vars for managed_node1 18445 1726882533.30400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882533.30622: done with get_vars() 18445 1726882533.30641: done queuing things up, now waiting for results queue to drain 18445 1726882533.30651: results queue empty 18445 1726882533.30652: checking for any_errors_fatal 18445 1726882533.30654: done checking for any_errors_fatal 18445 1726882533.30654: checking for max_fail_percentage 18445 1726882533.30655: done checking for max_fail_percentage 18445 1726882533.30656: checking to see if all hosts have failed and the running result is not ok 18445 1726882533.30657: done checking to see if all hosts have failed 18445 1726882533.30657: getting the remaining hosts for this loop 18445 1726882533.30658: done getting the remaining hosts for this loop 18445 1726882533.30661: getting the next task for host managed_node1 18445 1726882533.30667: done getting next task for host managed_node1 18445 
1726882533.30680: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 18445 1726882533.30681: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882533.30691: getting variables 18445 1726882533.30692: in VariableManager get_vars() 18445 1726882533.30704: Calling all_inventory to load vars for managed_node1 18445 1726882533.30706: Calling groups_inventory to load vars for managed_node1 18445 1726882533.30708: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882533.30716: Calling all_plugins_play to load vars for managed_node1 18445 1726882533.30718: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882533.30727: Calling groups_plugins_play to load vars for managed_node1 18445 1726882533.30892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882533.31085: done with get_vars() 18445 1726882533.31092: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:35:33 -0400 (0:00:00.053) 0:00:05.248 ****** 18445 1726882533.31157: entering _queue_task() for managed_node1/include_tasks 18445 1726882533.31485: worker is 1 (out of 1 available) 18445 1726882533.31507: exiting _queue_task() for managed_node1/include_tasks 18445 1726882533.31517: done queuing things up, now waiting for results queue to drain 18445 1726882533.31519: waiting for pending results... 
18445 1726882533.31789: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 18445 1726882533.31850: in run() - task 0e448fcc-3ce9-f6eb-935c-00000000003c 18445 1726882533.31865: variable 'ansible_search_path' from source: unknown 18445 1726882533.31869: variable 'ansible_search_path' from source: unknown 18445 1726882533.31899: calling self._execute() 18445 1726882533.31954: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882533.31964: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882533.31973: variable 'omit' from source: magic vars 18445 1726882533.32253: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882533.33926: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882533.33994: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882533.34033: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882533.34085: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882533.34114: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882533.34195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882533.34228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882533.34260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882533.34312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882533.34331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882533.34461: variable 'ansible_distribution' from source: facts 18445 1726882533.34474: variable 'ansible_distribution_major_version' from source: facts 18445 1726882533.34495: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882533.34501: when evaluation is False, skipping this task 18445 1726882533.34507: _execute() done 18445 1726882533.34513: dumping result to json 18445 1726882533.34520: done dumping result, returning 18445 1726882533.34529: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-f6eb-935c-00000000003c] 18445 1726882533.34538: sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000003c 18445 1726882533.34633: done sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000003c 18445 1726882533.34639: WORKER PROCESS EXITING skipping: 
[managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882533.34691: no more pending results, returning what we have 18445 1726882533.34694: results queue empty 18445 1726882533.34695: checking for any_errors_fatal 18445 1726882533.34697: done checking for any_errors_fatal 18445 1726882533.34697: checking for max_fail_percentage 18445 1726882533.34699: done checking for max_fail_percentage 18445 1726882533.34700: checking to see if all hosts have failed and the running result is not ok 18445 1726882533.34701: done checking to see if all hosts have failed 18445 1726882533.34701: getting the remaining hosts for this loop 18445 1726882533.34702: done getting the remaining hosts for this loop 18445 1726882533.34706: getting the next task for host managed_node1 18445 1726882533.34711: done getting next task for host managed_node1 18445 1726882533.34714: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 18445 1726882533.34716: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882533.34727: getting variables 18445 1726882533.34729: in VariableManager get_vars() 18445 1726882533.34763: Calling all_inventory to load vars for managed_node1 18445 1726882533.34767: Calling groups_inventory to load vars for managed_node1 18445 1726882533.34769: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882533.34777: Calling all_plugins_play to load vars for managed_node1 18445 1726882533.34779: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882533.34782: Calling groups_plugins_play to load vars for managed_node1 18445 1726882533.34945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882533.35159: done with get_vars() 18445 1726882533.35170: done getting variables 18445 1726882533.35230: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:35:33 -0400 (0:00:00.040) 0:00:05.289 ****** 18445 1726882533.35258: entering _queue_task() for managed_node1/debug 18445 1726882533.35488: worker is 1 (out of 1 available) 18445 1726882533.35499: exiting _queue_task() for managed_node1/debug 18445 1726882533.35509: done queuing things up, now waiting for results queue to drain 18445 1726882533.35511: waiting for pending results... 
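The "Print network provider" task (tasks/main.yml:7) is served by the debug action plugin loaded just above. A plausible shape for such a task is sketched here; the message text and the network_provider variable name are assumptions for illustration, not copied from the role's source.

    # Illustrative task-file snippet; the variable name is an assumption.
    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"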
18445 1726882533.35775: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 18445 1726882533.35878: in run() - task 0e448fcc-3ce9-f6eb-935c-00000000003d 18445 1726882533.35895: variable 'ansible_search_path' from source: unknown 18445 1726882533.35901: variable 'ansible_search_path' from source: unknown 18445 1726882533.35937: calling self._execute() 18445 1726882533.36017: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882533.36027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882533.36039: variable 'omit' from source: magic vars 18445 1726882533.36539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882533.39687: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882533.39762: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882533.39810: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882533.39848: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882533.39881: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882533.39967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882533.40009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882533.40039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882533.40086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882533.40115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882533.40258: variable 'ansible_distribution' from source: facts 18445 1726882533.40272: variable 'ansible_distribution_major_version' from source: facts 18445 1726882533.40293: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882533.40299: when evaluation is False, skipping this task 18445 1726882533.40305: _execute() done 18445 1726882533.40310: dumping result to json 18445 1726882533.40316: done dumping result, returning 18445 1726882533.40335: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-f6eb-935c-00000000003d] 18445 1726882533.40345: sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000003d skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 18445 1726882533.40472: no more 
pending results, returning what we have 18445 1726882533.40475: results queue empty 18445 1726882533.40476: checking for any_errors_fatal 18445 1726882533.40481: done checking for any_errors_fatal 18445 1726882533.40482: checking for max_fail_percentage 18445 1726882533.40483: done checking for max_fail_percentage 18445 1726882533.40484: checking to see if all hosts have failed and the running result is not ok 18445 1726882533.40485: done checking to see if all hosts have failed 18445 1726882533.40486: getting the remaining hosts for this loop 18445 1726882533.40488: done getting the remaining hosts for this loop 18445 1726882533.40491: getting the next task for host managed_node1 18445 1726882533.40497: done getting next task for host managed_node1 18445 1726882533.40500: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 18445 1726882533.40502: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882533.40514: getting variables 18445 1726882533.40516: in VariableManager get_vars() 18445 1726882533.40605: Calling all_inventory to load vars for managed_node1 18445 1726882533.40609: Calling groups_inventory to load vars for managed_node1 18445 1726882533.40612: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882533.40621: Calling all_plugins_play to load vars for managed_node1 18445 1726882533.40624: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882533.40627: Calling groups_plugins_play to load vars for managed_node1 18445 1726882533.40800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882533.41018: done with get_vars() 18445 1726882533.41028: done getting variables 18445 1726882533.41098: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 18445 1726882533.41127: done sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000003d 18445 1726882533.41130: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:35:33 -0400 (0:00:00.058) 0:00:05.348 ****** 18445 1726882533.41144: entering _queue_task() for managed_node1/fail 18445 1726882533.41862: worker is 1 (out of 1 available) 18445 1726882533.41877: exiting _queue_task() for managed_node1/fail 18445 1726882533.41890: done queuing things up, now waiting for results queue to drain 18445 1726882533.41891: waiting for pending results... 
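The abort task queued above (tasks/main.yml:11) uses the fail action plugin that the log shows being loaded. A hedged sketch of what a task matching that title could look like follows; the network_state and network_provider variable names and the message are assumptions inferred from the task title, not taken from the role.

    # Illustrative task-file snippet; variable names are assumptions
    # inferred from the task title.
    - name: Abort when network_state is used with the initscripts provider
      ansible.builtin.fail:
        msg: "Applying network_state is not supported with the initscripts provider"
      when:
        - network_state is defined
        - network_provider == "initscripts"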
18445 1726882533.42156: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 18445 1726882533.42298: in run() - task 0e448fcc-3ce9-f6eb-935c-00000000003e 18445 1726882533.42315: variable 'ansible_search_path' from source: unknown 18445 1726882533.42322: variable 'ansible_search_path' from source: unknown 18445 1726882533.42366: calling self._execute() 18445 1726882533.42444: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882533.42775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882533.42793: variable 'omit' from source: magic vars 18445 1726882533.43198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882533.46407: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882533.46484: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882533.46525: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882533.46823: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882533.46853: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882533.46942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882533.46977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882533.47009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882533.47066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882533.47087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882533.47223: variable 'ansible_distribution' from source: facts 18445 1726882533.47237: variable 'ansible_distribution_major_version' from source: facts 18445 1726882533.47265: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882533.47273: when evaluation is False, skipping this task 18445 1726882533.47280: _execute() done 18445 1726882533.47287: dumping result to json 18445 1726882533.47295: done dumping result, returning 18445 1726882533.47306: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-f6eb-935c-00000000003e] 18445 1726882533.47316: sending task result for task 
0e448fcc-3ce9-f6eb-935c-00000000003e skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882533.47459: no more pending results, returning what we have 18445 1726882533.47465: results queue empty 18445 1726882533.47466: checking for any_errors_fatal 18445 1726882533.47471: done checking for any_errors_fatal 18445 1726882533.47472: checking for max_fail_percentage 18445 1726882533.47473: done checking for max_fail_percentage 18445 1726882533.47474: checking to see if all hosts have failed and the running result is not ok 18445 1726882533.47475: done checking to see if all hosts have failed 18445 1726882533.47476: getting the remaining hosts for this loop 18445 1726882533.47478: done getting the remaining hosts for this loop 18445 1726882533.47482: getting the next task for host managed_node1 18445 1726882533.47488: done getting next task for host managed_node1 18445 1726882533.47491: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 18445 1726882533.47494: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882533.47506: getting variables 18445 1726882533.47508: in VariableManager get_vars() 18445 1726882533.47547: Calling all_inventory to load vars for managed_node1 18445 1726882533.47550: Calling groups_inventory to load vars for managed_node1 18445 1726882533.47553: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882533.47564: Calling all_plugins_play to load vars for managed_node1 18445 1726882533.47568: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882533.47572: Calling groups_plugins_play to load vars for managed_node1 18445 1726882533.47744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882533.47944: done with get_vars() 18445 1726882533.47955: done getting variables 18445 1726882533.48017: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:35:33 -0400 (0:00:00.069) 0:00:05.417 ****** 18445 1726882533.48052: entering _queue_task() for managed_node1/fail 18445 1726882533.48073: done sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000003e 18445 1726882533.48082: WORKER PROCESS EXITING 18445 1726882533.48561: worker is 1 (out of 1 available) 18445 1726882533.48575: exiting _queue_task() for managed_node1/fail 18445 1726882533.48586: done queuing things up, now waiting for results queue to drain 18445 1726882533.48588: waiting for pending results... 
18445 1726882533.48843: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 18445 1726882533.48941: in run() - task 0e448fcc-3ce9-f6eb-935c-00000000003f 18445 1726882533.48959: variable 'ansible_search_path' from source: unknown 18445 1726882533.48972: variable 'ansible_search_path' from source: unknown 18445 1726882533.49013: calling self._execute() 18445 1726882533.49167: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882533.49178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882533.49191: variable 'omit' from source: magic vars 18445 1726882533.49632: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882533.54031: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882533.54351: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882533.54395: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882533.54437: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882533.54471: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882533.54555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882533.54591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882533.54623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882533.54675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882533.54696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882533.54830: variable 'ansible_distribution' from source: facts 18445 1726882533.54841: variable 'ansible_distribution_major_version' from source: facts 18445 1726882533.54871: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882533.54879: when evaluation is False, skipping this task 18445 1726882533.54885: _execute() done 18445 1726882533.54892: dumping result to json 18445 1726882533.54899: done dumping result, returning 18445 1726882533.54910: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-f6eb-935c-00000000003f] 18445 1726882533.54921: sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000003f skipping: [managed_node1] 
=> { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882533.55067: no more pending results, returning what we have 18445 1726882533.55071: results queue empty 18445 1726882533.55072: checking for any_errors_fatal 18445 1726882533.55080: done checking for any_errors_fatal 18445 1726882533.55081: checking for max_fail_percentage 18445 1726882533.55083: done checking for max_fail_percentage 18445 1726882533.55084: checking to see if all hosts have failed and the running result is not ok 18445 1726882533.55085: done checking to see if all hosts have failed 18445 1726882533.55086: getting the remaining hosts for this loop 18445 1726882533.55087: done getting the remaining hosts for this loop 18445 1726882533.55091: getting the next task for host managed_node1 18445 1726882533.55097: done getting next task for host managed_node1 18445 1726882533.55101: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 18445 1726882533.55103: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882533.55116: getting variables 18445 1726882533.55118: in VariableManager get_vars() 18445 1726882533.55211: Calling all_inventory to load vars for managed_node1 18445 1726882533.55213: Calling groups_inventory to load vars for managed_node1 18445 1726882533.55216: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882533.55226: Calling all_plugins_play to load vars for managed_node1 18445 1726882533.55229: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882533.55233: Calling groups_plugins_play to load vars for managed_node1 18445 1726882533.55394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882533.55592: done with get_vars() 18445 1726882533.55603: done getting variables 18445 1726882533.55929: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 18445 1726882533.55952: done sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000003f 18445 1726882533.55955: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:35:33 -0400 (0:00:00.079) 0:00:05.496 ****** 18445 1726882533.55970: entering _queue_task() for managed_node1/fail 18445 1726882533.56212: worker is 1 (out of 1 available) 18445 1726882533.56223: exiting _queue_task() for managed_node1/fail 18445 1726882533.56234: done queuing things up, now waiting for results queue to drain 18445 1726882533.56235: waiting for pending results... 
18445 1726882533.56492: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 18445 1726882533.56593: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000040 18445 1726882533.56610: variable 'ansible_search_path' from source: unknown 18445 1726882533.56617: variable 'ansible_search_path' from source: unknown 18445 1726882533.56653: calling self._execute() 18445 1726882533.56734: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882533.56743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882533.56754: variable 'omit' from source: magic vars 18445 1726882533.57144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882533.59738: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882533.59803: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882533.59845: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882533.59886: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882533.59919: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882533.60001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882533.60038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882533.60071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882533.60116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882533.60131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882533.60251: variable 'ansible_distribution' from source: facts 18445 1726882533.60262: variable 'ansible_distribution_major_version' from source: facts 18445 1726882533.60288: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882533.60296: when evaluation is False, skipping this task 18445 1726882533.60303: _execute() done 18445 1726882533.60309: dumping result to json 18445 1726882533.60315: done dumping result, returning 18445 1726882533.60326: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-f6eb-935c-000000000040] 18445 1726882533.60336: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000040 18445 1726882533.60445: done 
sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000040 18445 1726882533.60452: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882533.60512: no more pending results, returning what we have 18445 1726882533.60516: results queue empty 18445 1726882533.60517: checking for any_errors_fatal 18445 1726882533.60525: done checking for any_errors_fatal 18445 1726882533.60526: checking for max_fail_percentage 18445 1726882533.60527: done checking for max_fail_percentage 18445 1726882533.60529: checking to see if all hosts have failed and the running result is not ok 18445 1726882533.60530: done checking to see if all hosts have failed 18445 1726882533.60530: getting the remaining hosts for this loop 18445 1726882533.60532: done getting the remaining hosts for this loop 18445 1726882533.60536: getting the next task for host managed_node1 18445 1726882533.60542: done getting next task for host managed_node1 18445 1726882533.60546: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 18445 1726882533.60549: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882533.60561: getting variables 18445 1726882533.60564: in VariableManager get_vars() 18445 1726882533.60606: Calling all_inventory to load vars for managed_node1 18445 1726882533.60609: Calling groups_inventory to load vars for managed_node1 18445 1726882533.60612: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882533.60622: Calling all_plugins_play to load vars for managed_node1 18445 1726882533.60626: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882533.60629: Calling groups_plugins_play to load vars for managed_node1 18445 1726882533.60806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882533.61053: done with get_vars() 18445 1726882533.61067: done getting variables 18445 1726882533.61134: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:35:33 -0400 (0:00:00.051) 0:00:05.548 ****** 18445 1726882533.61171: entering _queue_task() for managed_node1/dnf 18445 1726882533.61642: worker is 1 (out of 1 available) 18445 1726882533.61654: exiting _queue_task() for managed_node1/dnf 18445 1726882533.61669: done queuing things up, now waiting for results queue to drain 18445 1726882533.61671: waiting for pending results... 
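The task queued above, "Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces" (tasks/main.yml:36), is handled by the dnf action plugin. One common way to probe for available updates without applying them is to run the dnf module in check mode, as in this hedged sketch; the package-list variable and register name are hypothetical, not the role's actual parameters.

    # Illustrative only: probing for updates without installing them.
    # The variable and register names are hypothetical.
    - name: Check if updates for network packages are available
      ansible.builtin.dnf:
        name: "{{ network_packages }}"
        state: latest
      check_mode: true
      register: __network_updates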
18445 1726882533.61925: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 18445 1726882533.62023: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000041 18445 1726882533.62042: variable 'ansible_search_path' from source: unknown 18445 1726882533.62053: variable 'ansible_search_path' from source: unknown 18445 1726882533.62095: calling self._execute() 18445 1726882533.62170: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882533.62182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882533.62196: variable 'omit' from source: magic vars 18445 1726882533.62607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882533.65149: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882533.65215: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882533.65258: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882533.65297: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882533.65326: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882533.65407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882533.65440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882533.65474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882533.65519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882533.65538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882533.65668: variable 'ansible_distribution' from source: facts 18445 1726882533.65679: variable 'ansible_distribution_major_version' from source: facts 18445 1726882533.65703: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882533.65710: when evaluation is False, skipping this task 18445 1726882533.65716: _execute() done 18445 1726882533.65721: dumping result to json 18445 1726882533.65728: done dumping result, returning 18445 1726882533.65738: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-f6eb-935c-000000000041] 18445 1726882533.65748: sending task result for task 
0e448fcc-3ce9-f6eb-935c-000000000041 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882533.65886: no more pending results, returning what we have 18445 1726882533.65890: results queue empty 18445 1726882533.65891: checking for any_errors_fatal 18445 1726882533.65897: done checking for any_errors_fatal 18445 1726882533.65898: checking for max_fail_percentage 18445 1726882533.65900: done checking for max_fail_percentage 18445 1726882533.65901: checking to see if all hosts have failed and the running result is not ok 18445 1726882533.65902: done checking to see if all hosts have failed 18445 1726882533.65902: getting the remaining hosts for this loop 18445 1726882533.65904: done getting the remaining hosts for this loop 18445 1726882533.65907: getting the next task for host managed_node1 18445 1726882533.65914: done getting next task for host managed_node1 18445 1726882533.65917: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 18445 1726882533.65919: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882533.65931: getting variables 18445 1726882533.65932: in VariableManager get_vars() 18445 1726882533.65969: Calling all_inventory to load vars for managed_node1 18445 1726882533.65971: Calling groups_inventory to load vars for managed_node1 18445 1726882533.65974: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882533.65984: Calling all_plugins_play to load vars for managed_node1 18445 1726882533.65987: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882533.65989: Calling groups_plugins_play to load vars for managed_node1 18445 1726882533.66157: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882533.66363: done with get_vars() 18445 1726882533.66375: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 18445 1726882533.66450: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:35:33 -0400 (0:00:00.053) 0:00:05.601 ****** 18445 1726882533.66485: entering _queue_task() for managed_node1/yum 18445 1726882533.66503: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000041 18445 1726882533.66511: WORKER PROCESS EXITING 18445 1726882533.66960: worker is 1 (out of 1 available) 18445 1726882533.66972: exiting _queue_task() for managed_node1/yum 18445 1726882533.66983: done queuing things up, now 
waiting for results queue to drain 18445 1726882533.66984: waiting for pending results... 18445 1726882533.67262: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 18445 1726882533.67385: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000042 18445 1726882533.67403: variable 'ansible_search_path' from source: unknown 18445 1726882533.67411: variable 'ansible_search_path' from source: unknown 18445 1726882533.67456: calling self._execute() 18445 1726882533.67537: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882533.67548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882533.67562: variable 'omit' from source: magic vars 18445 1726882533.68000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882533.71188: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882533.71246: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882533.71290: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882533.71324: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882533.71349: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882533.71437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882533.71470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882533.71500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882533.71541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882533.71556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882533.71754: variable 'ansible_distribution' from source: facts 18445 1726882533.71768: variable 'ansible_distribution_major_version' from source: facts 18445 1726882533.71788: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882533.71794: when evaluation is False, skipping this task 18445 1726882533.71800: _execute() done 18445 1726882533.71805: dumping result to json 18445 1726882533.71812: done dumping result, returning 18445 1726882533.71821: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 
[0e448fcc-3ce9-f6eb-935c-000000000042] 18445 1726882533.71833: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000042 18445 1726882533.71934: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000042 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882533.71981: no more pending results, returning what we have 18445 1726882533.71984: results queue empty 18445 1726882533.71985: checking for any_errors_fatal 18445 1726882533.71991: done checking for any_errors_fatal 18445 1726882533.71992: checking for max_fail_percentage 18445 1726882533.71993: done checking for max_fail_percentage 18445 1726882533.71994: checking to see if all hosts have failed and the running result is not ok 18445 1726882533.71995: done checking to see if all hosts have failed 18445 1726882533.71996: getting the remaining hosts for this loop 18445 1726882533.71997: done getting the remaining hosts for this loop 18445 1726882533.72000: getting the next task for host managed_node1 18445 1726882533.72006: done getting next task for host managed_node1 18445 1726882533.72010: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 18445 1726882533.72012: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882533.72023: getting variables 18445 1726882533.72025: in VariableManager get_vars() 18445 1726882533.72061: Calling all_inventory to load vars for managed_node1 18445 1726882533.72065: Calling groups_inventory to load vars for managed_node1 18445 1726882533.72068: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882533.72079: Calling all_plugins_play to load vars for managed_node1 18445 1726882533.72081: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882533.72084: Calling groups_plugins_play to load vars for managed_node1 18445 1726882533.72294: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882533.72492: done with get_vars() 18445 1726882533.72502: done getting variables 18445 1726882533.72560: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 18445 1726882533.72785: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:35:33 -0400 (0:00:00.063) 0:00:05.665 ****** 18445 1726882533.72799: entering _queue_task() for managed_node1/fail 18445 1726882533.73196: worker is 1 (out of 1 available) 18445 1726882533.73208: exiting _queue_task() for managed_node1/fail 18445 1726882533.73220: done queuing things up, now waiting for results queue to drain 18445 1726882533.73222: waiting for pending results... 
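Note the "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" entry a little earlier in this output: on ansible-core 2.17, tasks written against the legacy yum module are resolved to the dnf action plugin, which is why the YUM-titled task above still loads dnf.py. A hedged sketch of a task that would trigger that redirect follows; the package name is an assumption.

    # Illustrative only: a legacy yum task that ansible-core 2.17
    # resolves to the dnf action plugin, as logged above.
    - name: Check if updates for network packages are available via YUM
      ansible.builtin.yum:
        name: NetworkManager        # assumed package name for illustration
        state: latest
      check_mode: true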
18445 1726882533.73542: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 18445 1726882533.73635: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000043 18445 1726882533.73653: variable 'ansible_search_path' from source: unknown 18445 1726882533.73671: variable 'ansible_search_path' from source: unknown 18445 1726882533.73711: calling self._execute() 18445 1726882533.73842: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882533.73851: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882533.73870: variable 'omit' from source: magic vars 18445 1726882533.74256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882533.76937: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882533.77012: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882533.77056: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882533.77095: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882533.77122: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882533.77205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882533.77237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882533.77273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882533.77317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882533.77336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882533.77457: variable 'ansible_distribution' from source: facts 18445 1726882533.77473: variable 'ansible_distribution_major_version' from source: facts 18445 1726882533.77492: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882533.77498: when evaluation is False, skipping this task 18445 1726882533.77503: _execute() done 18445 1726882533.77508: dumping result to json 18445 1726882533.77513: done dumping result, returning 18445 1726882533.77521: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-f6eb-935c-000000000043] 18445 1726882533.77529: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000043 skipping: [managed_node1] => { "changed": false, 
"false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882533.77668: no more pending results, returning what we have 18445 1726882533.77671: results queue empty 18445 1726882533.77672: checking for any_errors_fatal 18445 1726882533.77678: done checking for any_errors_fatal 18445 1726882533.77679: checking for max_fail_percentage 18445 1726882533.77681: done checking for max_fail_percentage 18445 1726882533.77682: checking to see if all hosts have failed and the running result is not ok 18445 1726882533.77683: done checking to see if all hosts have failed 18445 1726882533.77683: getting the remaining hosts for this loop 18445 1726882533.77685: done getting the remaining hosts for this loop 18445 1726882533.77689: getting the next task for host managed_node1 18445 1726882533.77695: done getting next task for host managed_node1 18445 1726882533.77699: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 18445 1726882533.77702: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882533.77714: getting variables 18445 1726882533.77716: in VariableManager get_vars() 18445 1726882533.77755: Calling all_inventory to load vars for managed_node1 18445 1726882533.77758: Calling groups_inventory to load vars for managed_node1 18445 1726882533.77761: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882533.77772: Calling all_plugins_play to load vars for managed_node1 18445 1726882533.77776: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882533.77779: Calling groups_plugins_play to load vars for managed_node1 18445 1726882533.77950: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882533.78147: done with get_vars() 18445 1726882533.78157: done getting variables 18445 1726882533.78219: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:35:33 -0400 (0:00:00.054) 0:00:05.719 ****** 18445 1726882533.78254: entering _queue_task() for managed_node1/package 18445 1726882533.78501: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000043 18445 1726882533.78505: WORKER PROCESS EXITING 18445 1726882533.78726: worker is 1 (out of 1 available) 18445 1726882533.78739: exiting _queue_task() for managed_node1/package 18445 1726882533.78751: done queuing things up, now waiting for results queue to drain 18445 1726882533.78753: waiting for pending results... 
18445 1726882533.79011: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 18445 1726882533.79111: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000044 18445 1726882533.79129: variable 'ansible_search_path' from source: unknown 18445 1726882533.79137: variable 'ansible_search_path' from source: unknown 18445 1726882533.79179: calling self._execute() 18445 1726882533.79253: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882533.79267: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882533.79282: variable 'omit' from source: magic vars 18445 1726882533.79693: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882533.82231: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882533.82295: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882533.82338: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882533.82390: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882533.82419: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882533.82503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882533.82538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882533.82576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882533.82621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882533.82641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882533.82776: variable 'ansible_distribution' from source: facts 18445 1726882533.82788: variable 'ansible_distribution_major_version' from source: facts 18445 1726882533.82810: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882533.82817: when evaluation is False, skipping this task 18445 1726882533.82824: _execute() done 18445 1726882533.82831: dumping result to json 18445 1726882533.82838: done dumping result, returning 18445 1726882533.82849: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-f6eb-935c-000000000044] 18445 1726882533.82859: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000044 18445 1726882533.82961: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000044 18445 1726882533.82971: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, 
"false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882533.83029: no more pending results, returning what we have 18445 1726882533.83033: results queue empty 18445 1726882533.83034: checking for any_errors_fatal 18445 1726882533.83039: done checking for any_errors_fatal 18445 1726882533.83040: checking for max_fail_percentage 18445 1726882533.83041: done checking for max_fail_percentage 18445 1726882533.83043: checking to see if all hosts have failed and the running result is not ok 18445 1726882533.83044: done checking to see if all hosts have failed 18445 1726882533.83044: getting the remaining hosts for this loop 18445 1726882533.83046: done getting the remaining hosts for this loop 18445 1726882533.83049: getting the next task for host managed_node1 18445 1726882533.83055: done getting next task for host managed_node1 18445 1726882533.83059: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 18445 1726882533.83062: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882533.83077: getting variables 18445 1726882533.83079: in VariableManager get_vars() 18445 1726882533.83115: Calling all_inventory to load vars for managed_node1 18445 1726882533.83118: Calling groups_inventory to load vars for managed_node1 18445 1726882533.83121: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882533.83131: Calling all_plugins_play to load vars for managed_node1 18445 1726882533.83134: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882533.83137: Calling groups_plugins_play to load vars for managed_node1 18445 1726882533.83529: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882533.83924: done with get_vars() 18445 1726882533.83932: done getting variables 18445 1726882533.83988: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:35:33 -0400 (0:00:00.057) 0:00:05.777 ****** 18445 1726882533.84015: entering _queue_task() for managed_node1/package 18445 1726882533.84245: worker is 1 (out of 1 available) 18445 1726882533.84256: exiting _queue_task() for managed_node1/package 18445 1726882533.84270: done queuing things up, now waiting for results queue to drain 18445 1726882533.84271: waiting for pending results... 
18445 1726882533.84518: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 18445 1726882533.84615: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000045 18445 1726882533.84633: variable 'ansible_search_path' from source: unknown 18445 1726882533.84642: variable 'ansible_search_path' from source: unknown 18445 1726882533.84681: calling self._execute() 18445 1726882533.84758: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882533.84773: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882533.84788: variable 'omit' from source: magic vars 18445 1726882533.85198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882533.87500: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882533.87568: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882533.87606: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882533.87646: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882533.87677: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882533.87754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882533.87790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882533.87819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882533.87868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882533.87887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882533.88018: variable 'ansible_distribution' from source: facts 18445 1726882533.88029: variable 'ansible_distribution_major_version' from source: facts 18445 1726882533.88048: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882533.88055: when evaluation is False, skipping this task 18445 1726882533.88062: _execute() done 18445 1726882533.88072: dumping result to json 18445 1726882533.88079: done dumping result, returning 18445 1726882533.88089: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-f6eb-935c-000000000045] 18445 1726882533.88099: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000045 skipping: [managed_node1] => { "changed": false, "false_condition": 
"(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882533.88227: no more pending results, returning what we have 18445 1726882533.88230: results queue empty 18445 1726882533.88231: checking for any_errors_fatal 18445 1726882533.88238: done checking for any_errors_fatal 18445 1726882533.88238: checking for max_fail_percentage 18445 1726882533.88240: done checking for max_fail_percentage 18445 1726882533.88241: checking to see if all hosts have failed and the running result is not ok 18445 1726882533.88242: done checking to see if all hosts have failed 18445 1726882533.88243: getting the remaining hosts for this loop 18445 1726882533.88245: done getting the remaining hosts for this loop 18445 1726882533.88249: getting the next task for host managed_node1 18445 1726882533.88255: done getting next task for host managed_node1 18445 1726882533.88258: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 18445 1726882533.88261: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882533.88275: getting variables 18445 1726882533.88278: in VariableManager get_vars() 18445 1726882533.88314: Calling all_inventory to load vars for managed_node1 18445 1726882533.88318: Calling groups_inventory to load vars for managed_node1 18445 1726882533.88321: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882533.88330: Calling all_plugins_play to load vars for managed_node1 18445 1726882533.88333: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882533.88336: Calling groups_plugins_play to load vars for managed_node1 18445 1726882533.88506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882533.88714: done with get_vars() 18445 1726882533.88725: done getting variables 18445 1726882533.88786: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:35:33 -0400 (0:00:00.048) 0:00:05.825 ****** 18445 1726882533.88820: entering _queue_task() for managed_node1/package 18445 1726882533.88838: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000045 18445 1726882533.88846: WORKER PROCESS EXITING 18445 1726882533.89236: worker is 1 (out of 1 available) 18445 1726882533.89248: exiting _queue_task() for managed_node1/package 18445 1726882533.89258: done queuing things up, now waiting for results queue to drain 18445 1726882533.89260: waiting for pending results... 
18445 1726882533.89524: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 18445 1726882533.89624: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000046 18445 1726882533.89643: variable 'ansible_search_path' from source: unknown 18445 1726882533.89651: variable 'ansible_search_path' from source: unknown 18445 1726882533.89693: calling self._execute() 18445 1726882533.89777: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882533.89788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882533.89801: variable 'omit' from source: magic vars 18445 1726882533.90221: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882533.92563: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882533.92628: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882533.92677: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882533.92724: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882533.92750: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882533.92827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882533.92856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882533.92891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882533.92933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882533.92948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882533.93073: variable 'ansible_distribution' from source: facts 18445 1726882533.93087: variable 'ansible_distribution_major_version' from source: facts 18445 1726882533.93109: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882533.93116: when evaluation is False, skipping this task 18445 1726882533.93122: _execute() done 18445 1726882533.93128: dumping result to json 18445 1726882533.93134: done dumping result, returning 18445 1726882533.93142: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-f6eb-935c-000000000046] 18445 1726882533.93150: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000046 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in 
['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882533.93280: no more pending results, returning what we have 18445 1726882533.93284: results queue empty 18445 1726882533.93285: checking for any_errors_fatal 18445 1726882533.93290: done checking for any_errors_fatal 18445 1726882533.93291: checking for max_fail_percentage 18445 1726882533.93293: done checking for max_fail_percentage 18445 1726882533.93294: checking to see if all hosts have failed and the running result is not ok 18445 1726882533.93295: done checking to see if all hosts have failed 18445 1726882533.93296: getting the remaining hosts for this loop 18445 1726882533.93298: done getting the remaining hosts for this loop 18445 1726882533.93301: getting the next task for host managed_node1 18445 1726882533.93307: done getting next task for host managed_node1 18445 1726882533.93310: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 18445 1726882533.93312: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882533.93324: getting variables 18445 1726882533.93326: in VariableManager get_vars() 18445 1726882533.93360: Calling all_inventory to load vars for managed_node1 18445 1726882533.93362: Calling groups_inventory to load vars for managed_node1 18445 1726882533.93366: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882533.93375: Calling all_plugins_play to load vars for managed_node1 18445 1726882533.93378: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882533.93381: Calling groups_plugins_play to load vars for managed_node1 18445 1726882533.93587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882533.93783: done with get_vars() 18445 1726882533.93793: done getting variables 18445 1726882533.93848: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 18445 1726882533.93980: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000046 18445 1726882533.93983: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:35:33 -0400 (0:00:00.051) 0:00:05.876 ****** 18445 1726882533.93998: entering _queue_task() for managed_node1/service 18445 1726882533.94318: worker is 1 (out of 1 available) 18445 1726882533.94329: exiting _queue_task() for managed_node1/service 18445 1726882533.94339: done queuing things up, now waiting for results queue to drain 18445 1726882533.94340: waiting for pending results... 
18445 1726882533.94592: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 18445 1726882533.94695: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000047 18445 1726882533.94714: variable 'ansible_search_path' from source: unknown 18445 1726882533.94722: variable 'ansible_search_path' from source: unknown 18445 1726882533.94761: calling self._execute() 18445 1726882533.94841: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882533.94851: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882533.94868: variable 'omit' from source: magic vars 18445 1726882533.95272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882533.97571: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882533.97639: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882533.97679: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882533.97720: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882533.97749: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882533.97829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882533.97861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882533.97896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882533.97944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882533.97963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882533.98099: variable 'ansible_distribution' from source: facts 18445 1726882533.98110: variable 'ansible_distribution_major_version' from source: facts 18445 1726882533.98130: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882533.98137: when evaluation is False, skipping this task 18445 1726882533.98147: _execute() done 18445 1726882533.98153: dumping result to json 18445 1726882533.98160: done dumping result, returning 18445 1726882533.98171: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-f6eb-935c-000000000047] 18445 1726882533.98180: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000047 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in 
['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882533.98308: no more pending results, returning what we have 18445 1726882533.98311: results queue empty 18445 1726882533.98311: checking for any_errors_fatal 18445 1726882533.98319: done checking for any_errors_fatal 18445 1726882533.98320: checking for max_fail_percentage 18445 1726882533.98321: done checking for max_fail_percentage 18445 1726882533.98322: checking to see if all hosts have failed and the running result is not ok 18445 1726882533.98323: done checking to see if all hosts have failed 18445 1726882533.98324: getting the remaining hosts for this loop 18445 1726882533.98326: done getting the remaining hosts for this loop 18445 1726882533.98330: getting the next task for host managed_node1 18445 1726882533.98335: done getting next task for host managed_node1 18445 1726882533.98339: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 18445 1726882533.98341: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882533.98353: getting variables 18445 1726882533.98355: in VariableManager get_vars() 18445 1726882533.98391: Calling all_inventory to load vars for managed_node1 18445 1726882533.98394: Calling groups_inventory to load vars for managed_node1 18445 1726882533.98396: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882533.98406: Calling all_plugins_play to load vars for managed_node1 18445 1726882533.98409: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882533.98412: Calling groups_plugins_play to load vars for managed_node1 18445 1726882533.98576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882533.98762: done with get_vars() 18445 1726882533.98776: done getting variables 18445 1726882533.98835: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:35:33 -0400 (0:00:00.048) 0:00:05.925 ****** 18445 1726882533.98870: entering _queue_task() for managed_node1/service 18445 1726882533.98888: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000047 18445 1726882533.98897: WORKER PROCESS EXITING 18445 1726882533.99309: worker is 1 (out of 1 available) 18445 1726882533.99321: exiting _queue_task() for managed_node1/service 18445 1726882533.99331: done queuing things up, now waiting for results queue to drain 18445 1726882533.99333: waiting for pending results... 
18445 1726882533.99588: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 18445 1726882533.99683: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000048 18445 1726882533.99702: variable 'ansible_search_path' from source: unknown 18445 1726882533.99711: variable 'ansible_search_path' from source: unknown 18445 1726882533.99751: calling self._execute() 18445 1726882533.99828: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882533.99839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882533.99852: variable 'omit' from source: magic vars 18445 1726882534.00266: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882534.02568: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882534.02632: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882534.02670: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882534.02720: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882534.02747: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882534.02823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882534.02853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882534.02886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882534.02935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882534.02952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882534.03084: variable 'ansible_distribution' from source: facts 18445 1726882534.03094: variable 'ansible_distribution_major_version' from source: facts 18445 1726882534.03112: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882534.03118: when evaluation is False, skipping this task 18445 1726882534.03124: _execute() done 18445 1726882534.03129: dumping result to json 18445 1726882534.03140: done dumping result, returning 18445 1726882534.03150: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-f6eb-935c-000000000048] 18445 1726882534.03158: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000048 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 
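
The "censored" skip result above is what Ansible prints whenever no_log: true is set on a task: even a skipped result is hidden, and only "changed": false remains visible. A hedged sketch of a service task with that behaviour follows; no_log: true and the use of the service module are confirmed by the log, the remaining parameters are assumptions based on the task title.

    # Illustrative sketch; no_log: true is confirmed by the "censored" message,
    # the service parameters are assumptions.
    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: NetworkManager
        state: started
        enabled: true
      no_log: true
      when:
        - ansible_distribution in ['CentOS', 'RedHat']
        - ansible_distribution_major_version | int < 9
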
18445 1726882534.03278: no more pending results, returning what we have 18445 1726882534.03281: results queue empty 18445 1726882534.03282: checking for any_errors_fatal 18445 1726882534.03290: done checking for any_errors_fatal 18445 1726882534.03290: checking for max_fail_percentage 18445 1726882534.03292: done checking for max_fail_percentage 18445 1726882534.03293: checking to see if all hosts have failed and the running result is not ok 18445 1726882534.03294: done checking to see if all hosts have failed 18445 1726882534.03295: getting the remaining hosts for this loop 18445 1726882534.03296: done getting the remaining hosts for this loop 18445 1726882534.03299: getting the next task for host managed_node1 18445 1726882534.03305: done getting next task for host managed_node1 18445 1726882534.03308: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 18445 1726882534.03310: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882534.03321: getting variables 18445 1726882534.03322: in VariableManager get_vars() 18445 1726882534.03355: Calling all_inventory to load vars for managed_node1 18445 1726882534.03358: Calling groups_inventory to load vars for managed_node1 18445 1726882534.03360: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882534.03372: Calling all_plugins_play to load vars for managed_node1 18445 1726882534.03375: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882534.03378: Calling groups_plugins_play to load vars for managed_node1 18445 1726882534.03542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882534.03786: done with get_vars() 18445 1726882534.03796: done getting variables 18445 1726882534.03854: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 18445 1726882534.04078: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000048 18445 1726882534.04082: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:35:34 -0400 (0:00:00.052) 0:00:05.977 ****** 18445 1726882534.04094: entering _queue_task() for managed_node1/service 18445 1726882534.04325: worker is 1 (out of 1 available) 18445 1726882534.04336: exiting _queue_task() for managed_node1/service 18445 1726882534.04347: done queuing things up, now waiting for results queue to drain 18445 1726882534.04349: waiting for pending results... 
18445 1726882534.04594: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 18445 1726882534.04691: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000049 18445 1726882534.04709: variable 'ansible_search_path' from source: unknown 18445 1726882534.04717: variable 'ansible_search_path' from source: unknown 18445 1726882534.04753: calling self._execute() 18445 1726882534.04833: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882534.04844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882534.04859: variable 'omit' from source: magic vars 18445 1726882534.05271: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882534.07539: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882534.07611: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882534.07649: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882534.07689: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882534.07722: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882534.07801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882534.07838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882534.07871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882534.07915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882534.07937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882534.08073: variable 'ansible_distribution' from source: facts 18445 1726882534.08084: variable 'ansible_distribution_major_version' from source: facts 18445 1726882534.08104: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882534.08111: when evaluation is False, skipping this task 18445 1726882534.08116: _execute() done 18445 1726882534.08123: dumping result to json 18445 1726882534.08129: done dumping result, returning 18445 1726882534.08139: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-f6eb-935c-000000000049] 18445 1726882534.08152: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000049 18445 1726882534.08247: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000049 skipping: [managed_node1] => { "changed": false, "false_condition": 
"(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882534.08297: no more pending results, returning what we have 18445 1726882534.08301: results queue empty 18445 1726882534.08302: checking for any_errors_fatal 18445 1726882534.08307: done checking for any_errors_fatal 18445 1726882534.08308: checking for max_fail_percentage 18445 1726882534.08310: done checking for max_fail_percentage 18445 1726882534.08311: checking to see if all hosts have failed and the running result is not ok 18445 1726882534.08312: done checking to see if all hosts have failed 18445 1726882534.08312: getting the remaining hosts for this loop 18445 1726882534.08314: done getting the remaining hosts for this loop 18445 1726882534.08317: getting the next task for host managed_node1 18445 1726882534.08323: done getting next task for host managed_node1 18445 1726882534.08327: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 18445 1726882534.08329: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882534.08342: getting variables 18445 1726882534.08344: in VariableManager get_vars() 18445 1726882534.08382: Calling all_inventory to load vars for managed_node1 18445 1726882534.08385: Calling groups_inventory to load vars for managed_node1 18445 1726882534.08388: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882534.08397: Calling all_plugins_play to load vars for managed_node1 18445 1726882534.08400: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882534.08403: Calling groups_plugins_play to load vars for managed_node1 18445 1726882534.08578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882534.08790: done with get_vars() 18445 1726882534.08801: done getting variables 18445 1726882534.08858: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 18445 1726882534.09084: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:35:34 -0400 (0:00:00.050) 0:00:06.027 ****** 18445 1726882534.09097: entering _queue_task() for managed_node1/service 18445 1726882534.09382: worker is 1 (out of 1 available) 18445 1726882534.09391: exiting _queue_task() for managed_node1/service 18445 1726882534.09402: done queuing things up, now waiting for results queue to drain 18445 1726882534.09403: waiting for pending results... 
18445 1726882534.09659: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 18445 1726882534.09759: in run() - task 0e448fcc-3ce9-f6eb-935c-00000000004a 18445 1726882534.09780: variable 'ansible_search_path' from source: unknown 18445 1726882534.09786: variable 'ansible_search_path' from source: unknown 18445 1726882534.09821: calling self._execute() 18445 1726882534.09900: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882534.09909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882534.09920: variable 'omit' from source: magic vars 18445 1726882534.10358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882534.12788: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882534.12857: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882534.12903: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882534.12953: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882534.12988: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882534.13073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882534.13112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882534.13144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882534.13190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882534.13214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882534.13356: variable 'ansible_distribution' from source: facts 18445 1726882534.13371: variable 'ansible_distribution_major_version' from source: facts 18445 1726882534.13395: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882534.13403: when evaluation is False, skipping this task 18445 1726882534.13409: _execute() done 18445 1726882534.13415: dumping result to json 18445 1726882534.13423: done dumping result, returning 18445 1726882534.13439: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-f6eb-935c-00000000004a] 18445 1726882534.13450: sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000004a skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18445 
1726882534.13592: no more pending results, returning what we have 18445 1726882534.13596: results queue empty 18445 1726882534.13596: checking for any_errors_fatal 18445 1726882534.13603: done checking for any_errors_fatal 18445 1726882534.13604: checking for max_fail_percentage 18445 1726882534.13606: done checking for max_fail_percentage 18445 1726882534.13607: checking to see if all hosts have failed and the running result is not ok 18445 1726882534.13608: done checking to see if all hosts have failed 18445 1726882534.13608: getting the remaining hosts for this loop 18445 1726882534.13610: done getting the remaining hosts for this loop 18445 1726882534.13614: getting the next task for host managed_node1 18445 1726882534.13620: done getting next task for host managed_node1 18445 1726882534.13624: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 18445 1726882534.13626: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882534.13639: getting variables 18445 1726882534.13642: in VariableManager get_vars() 18445 1726882534.13683: Calling all_inventory to load vars for managed_node1 18445 1726882534.13686: Calling groups_inventory to load vars for managed_node1 18445 1726882534.13688: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882534.13699: Calling all_plugins_play to load vars for managed_node1 18445 1726882534.13702: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882534.13705: Calling groups_plugins_play to load vars for managed_node1 18445 1726882534.13936: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882534.14133: done with get_vars() 18445 1726882534.14144: done getting variables 18445 1726882534.14201: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:35:34 -0400 (0:00:00.051) 0:00:06.079 ****** 18445 1726882534.14232: entering _queue_task() for managed_node1/copy 18445 1726882534.14249: done sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000004a 18445 1726882534.14256: WORKER PROCESS EXITING 18445 1726882534.14735: worker is 1 (out of 1 available) 18445 1726882534.14747: exiting _queue_task() for managed_node1/copy 18445 1726882534.14757: done queuing things up, now waiting for results queue to drain 18445 1726882534.14758: waiting for pending results... 
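
The task queued above ("Ensure initscripts network file dependency is present", main.yml:150) loads the copy action plugin. A sketch of what such a task could look like is below; only the use of the copy module is confirmed by the log, while the destination path and content (the classic initscripts /etc/sysconfig/network file) are assumptions.

    # Illustrative sketch; the destination path and content are assumptions,
    # only the use of the copy module is confirmed by the log.
    - name: Ensure initscripts network file dependency is present
      ansible.builtin.copy:
        content: "# Created by the network system role\n"
        dest: /etc/sysconfig/network
        mode: "0644"
      when:
        - ansible_distribution in ['CentOS', 'RedHat']
        - ansible_distribution_major_version | int < 9
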
18445 1726882534.15002: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 18445 1726882534.15100: in run() - task 0e448fcc-3ce9-f6eb-935c-00000000004b 18445 1726882534.15119: variable 'ansible_search_path' from source: unknown 18445 1726882534.15126: variable 'ansible_search_path' from source: unknown 18445 1726882534.15167: calling self._execute() 18445 1726882534.15247: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882534.15259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882534.15276: variable 'omit' from source: magic vars 18445 1726882534.15706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882534.18134: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882534.18202: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882534.18246: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882534.18287: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882534.18319: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882534.18482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882534.18516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882534.18583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882534.18706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882534.18727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882534.18973: variable 'ansible_distribution' from source: facts 18445 1726882534.19107: variable 'ansible_distribution_major_version' from source: facts 18445 1726882534.19129: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882534.19137: when evaluation is False, skipping this task 18445 1726882534.19143: _execute() done 18445 1726882534.19150: dumping result to json 18445 1726882534.19158: done dumping result, returning 18445 1726882534.19172: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-f6eb-935c-00000000004b] 18445 1726882534.19182: sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000004b 18445 1726882534.19291: done sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000004b 18445 1726882534.19299: 
WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882534.19357: no more pending results, returning what we have 18445 1726882534.19360: results queue empty 18445 1726882534.19361: checking for any_errors_fatal 18445 1726882534.19370: done checking for any_errors_fatal 18445 1726882534.19371: checking for max_fail_percentage 18445 1726882534.19373: done checking for max_fail_percentage 18445 1726882534.19374: checking to see if all hosts have failed and the running result is not ok 18445 1726882534.19375: done checking to see if all hosts have failed 18445 1726882534.19376: getting the remaining hosts for this loop 18445 1726882534.19377: done getting the remaining hosts for this loop 18445 1726882534.19381: getting the next task for host managed_node1 18445 1726882534.19387: done getting next task for host managed_node1 18445 1726882534.19391: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 18445 1726882534.19394: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882534.19406: getting variables 18445 1726882534.19408: in VariableManager get_vars() 18445 1726882534.19446: Calling all_inventory to load vars for managed_node1 18445 1726882534.19448: Calling groups_inventory to load vars for managed_node1 18445 1726882534.19451: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882534.19460: Calling all_plugins_play to load vars for managed_node1 18445 1726882534.19467: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882534.19471: Calling groups_plugins_play to load vars for managed_node1 18445 1726882534.19647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882534.19854: done with get_vars() 18445 1726882534.19868: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:35:34 -0400 (0:00:00.057) 0:00:06.136 ****** 18445 1726882534.19958: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 18445 1726882534.21326: worker is 1 (out of 1 available) 18445 1726882534.21337: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 18445 1726882534.21349: done queuing things up, now waiting for results queue to drain 18445 1726882534.21351: waiting for pending results... 
18445 1726882534.21844: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 18445 1726882534.22046: in run() - task 0e448fcc-3ce9-f6eb-935c-00000000004c 18445 1726882534.22139: variable 'ansible_search_path' from source: unknown 18445 1726882534.22147: variable 'ansible_search_path' from source: unknown 18445 1726882534.22189: calling self._execute() 18445 1726882534.22270: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882534.22281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882534.22297: variable 'omit' from source: magic vars 18445 1726882534.22700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882534.25116: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882534.25185: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882534.25229: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882534.25281: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882534.25310: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882534.25500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882534.25535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882534.25569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882534.25613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882534.25630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882534.25772: variable 'ansible_distribution' from source: facts 18445 1726882534.25782: variable 'ansible_distribution_major_version' from source: facts 18445 1726882534.25803: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882534.25809: when evaluation is False, skipping this task 18445 1726882534.25815: _execute() done 18445 1726882534.25820: dumping result to json 18445 1726882534.25826: done dumping result, returning 18445 1726882534.25836: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-f6eb-935c-00000000004c] 18445 1726882534.25845: sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000004c 18445 1726882534.25954: done sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000004c 18445 1726882534.25961: WORKER PROCESS EXITING 
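
Unlike the earlier tasks, "Configure networking connection profiles" is executed through the collection's own network_connections action plugin rather than a builtin module, and it consumes the network_connections variable supplied by the caller. The snippet below is an illustrative caller-side invocation only; the interface name and addressing are hypothetical and not taken from this run.

    # Illustrative caller-side example; the interface and its settings are hypothetical.
    - hosts: managed_node1
      vars:
        network_connections:
          - name: eth0
            type: ethernet
            interface_name: eth0
            ip:
              dhcp4: true
            state: up
      roles:
        - fedora.linux_system_roles.network
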
skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882534.26024: no more pending results, returning what we have 18445 1726882534.26028: results queue empty 18445 1726882534.26029: checking for any_errors_fatal 18445 1726882534.26037: done checking for any_errors_fatal 18445 1726882534.26038: checking for max_fail_percentage 18445 1726882534.26039: done checking for max_fail_percentage 18445 1726882534.26041: checking to see if all hosts have failed and the running result is not ok 18445 1726882534.26042: done checking to see if all hosts have failed 18445 1726882534.26042: getting the remaining hosts for this loop 18445 1726882534.26044: done getting the remaining hosts for this loop 18445 1726882534.26048: getting the next task for host managed_node1 18445 1726882534.26054: done getting next task for host managed_node1 18445 1726882534.26057: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 18445 1726882534.26060: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882534.26074: getting variables 18445 1726882534.26076: in VariableManager get_vars() 18445 1726882534.26114: Calling all_inventory to load vars for managed_node1 18445 1726882534.26117: Calling groups_inventory to load vars for managed_node1 18445 1726882534.26119: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882534.26129: Calling all_plugins_play to load vars for managed_node1 18445 1726882534.26132: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882534.26135: Calling groups_plugins_play to load vars for managed_node1 18445 1726882534.26358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882534.26560: done with get_vars() 18445 1726882534.26573: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:35:34 -0400 (0:00:00.066) 0:00:06.203 ****** 18445 1726882534.26657: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 18445 1726882534.27135: worker is 1 (out of 1 available) 18445 1726882534.27147: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 18445 1726882534.27158: done queuing things up, now waiting for results queue to drain 18445 1726882534.27159: waiting for pending results... 
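
The companion "Configure networking state" task uses the network_state action plugin, which takes a declarative, nmstate-style description instead of connection profiles. A hedged example of what such a caller-side variable could look like follows; the nmstate-style structure and the interface shown are assumptions for illustration.

    # Illustrative caller-side example; the structure follows nmstate conventions
    # and the interface shown is hypothetical.
    - hosts: managed_node1
      vars:
        network_state:
          interfaces:
            - name: eth0
              type: ethernet
              state: up
              ipv4:
                enabled: true
                dhcp: true
      roles:
        - fedora.linux_system_roles.network
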
18445 1726882534.27433: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 18445 1726882534.27532: in run() - task 0e448fcc-3ce9-f6eb-935c-00000000004d 18445 1726882534.27552: variable 'ansible_search_path' from source: unknown 18445 1726882534.27559: variable 'ansible_search_path' from source: unknown 18445 1726882534.27603: calling self._execute() 18445 1726882534.27684: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882534.27695: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882534.27709: variable 'omit' from source: magic vars 18445 1726882534.28142: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882534.30476: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882534.30537: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882534.30576: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882534.30611: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882534.30641: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882534.30724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882534.30756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882534.30789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882534.30830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882534.30847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882534.30972: variable 'ansible_distribution' from source: facts 18445 1726882534.30988: variable 'ansible_distribution_major_version' from source: facts 18445 1726882534.31010: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882534.31017: when evaluation is False, skipping this task 18445 1726882534.31024: _execute() done 18445 1726882534.31031: dumping result to json 18445 1726882534.31038: done dumping result, returning 18445 1726882534.31051: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-f6eb-935c-00000000004d] 18445 1726882534.31065: sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000004d 18445 1726882534.31174: done sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000004d 18445 1726882534.31182: WORKER PROCESS EXITING skipping: [managed_node1] => { 
"changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882534.31240: no more pending results, returning what we have 18445 1726882534.31243: results queue empty 18445 1726882534.31244: checking for any_errors_fatal 18445 1726882534.31252: done checking for any_errors_fatal 18445 1726882534.31253: checking for max_fail_percentage 18445 1726882534.31255: done checking for max_fail_percentage 18445 1726882534.31257: checking to see if all hosts have failed and the running result is not ok 18445 1726882534.31258: done checking to see if all hosts have failed 18445 1726882534.31258: getting the remaining hosts for this loop 18445 1726882534.31260: done getting the remaining hosts for this loop 18445 1726882534.31266: getting the next task for host managed_node1 18445 1726882534.31272: done getting next task for host managed_node1 18445 1726882534.31275: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 18445 1726882534.31278: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882534.31290: getting variables 18445 1726882534.31292: in VariableManager get_vars() 18445 1726882534.31328: Calling all_inventory to load vars for managed_node1 18445 1726882534.31331: Calling groups_inventory to load vars for managed_node1 18445 1726882534.31333: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882534.31343: Calling all_plugins_play to load vars for managed_node1 18445 1726882534.31345: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882534.31348: Calling groups_plugins_play to load vars for managed_node1 18445 1726882534.31522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882534.31730: done with get_vars() 18445 1726882534.31741: done getting variables 18445 1726882534.31799: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:35:34 -0400 (0:00:00.051) 0:00:06.255 ****** 18445 1726882534.31833: entering _queue_task() for managed_node1/debug 18445 1726882534.32288: worker is 1 (out of 1 available) 18445 1726882534.32299: exiting _queue_task() for managed_node1/debug 18445 1726882534.32310: done queuing things up, now waiting for results queue to drain 18445 1726882534.32311: waiting for pending results... 
18445 1726882534.32563: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 18445 1726882534.32651: in run() - task 0e448fcc-3ce9-f6eb-935c-00000000004e 18445 1726882534.32672: variable 'ansible_search_path' from source: unknown 18445 1726882534.32679: variable 'ansible_search_path' from source: unknown 18445 1726882534.32716: calling self._execute() 18445 1726882534.32795: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882534.32806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882534.32818: variable 'omit' from source: magic vars 18445 1726882534.33223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882534.35518: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882534.35588: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882534.35626: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882534.35678: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882534.35711: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882534.35795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882534.35829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882534.35859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882534.35913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882534.35934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882534.36068: variable 'ansible_distribution' from source: facts 18445 1726882534.36080: variable 'ansible_distribution_major_version' from source: facts 18445 1726882534.36102: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882534.36110: when evaluation is False, skipping this task 18445 1726882534.36121: _execute() done 18445 1726882534.36128: dumping result to json 18445 1726882534.36136: done dumping result, returning 18445 1726882534.36147: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-f6eb-935c-00000000004e] 18445 1726882534.36157: sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000004e skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)" } 18445 1726882534.36297: no more pending results, returning what we have 18445 1726882534.36301: results queue empty 18445 1726882534.36302: checking for any_errors_fatal 18445 1726882534.36308: done checking for any_errors_fatal 18445 1726882534.36309: checking for max_fail_percentage 18445 1726882534.36311: done checking for max_fail_percentage 18445 1726882534.36312: checking to see if all hosts have failed and the running result is not ok 18445 1726882534.36313: done checking to see if all hosts have failed 18445 1726882534.36314: getting the remaining hosts for this loop 18445 1726882534.36315: done getting the remaining hosts for this loop 18445 1726882534.36319: getting the next task for host managed_node1 18445 1726882534.36324: done getting next task for host managed_node1 18445 1726882534.36328: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 18445 1726882534.36331: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882534.36343: getting variables 18445 1726882534.36345: in VariableManager get_vars() 18445 1726882534.36385: Calling all_inventory to load vars for managed_node1 18445 1726882534.36388: Calling groups_inventory to load vars for managed_node1 18445 1726882534.36390: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882534.36400: Calling all_plugins_play to load vars for managed_node1 18445 1726882534.36403: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882534.36406: Calling groups_plugins_play to load vars for managed_node1 18445 1726882534.36580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882534.36829: done with get_vars() 18445 1726882534.36840: done getting variables 18445 1726882534.36902: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:35:34 -0400 (0:00:00.050) 0:00:06.306 ****** 18445 1726882534.36936: entering _queue_task() for managed_node1/debug 18445 1726882534.36954: done sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000004e 18445 1726882534.36963: WORKER PROCESS EXITING 18445 1726882534.37411: worker is 1 (out of 1 available) 18445 1726882534.37424: exiting _queue_task() for managed_node1/debug 18445 1726882534.37435: done queuing things up, now waiting for results queue to drain 18445 1726882534.37436: waiting for pending results... 
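Editor's note: the skipped "Show stderr messages for the network_connections" task above and the "Show debug messages for the network_connections" task just queued both load the builtin debug action plugin (the log records "Loading ActionModule 'debug'" before each banner), and both are gated by the task-level when: clause that the skip result quotes as false_condition. As a rough illustration only — the registered variable name __network_connections_result and the exact task layout are assumptions, not read from the role source — such a guarded reporting task is typically written like this:

- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines   # assumed register name, purely for illustration
  when: ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9

Because the managed node's distribution facts do not satisfy that expression, the executor never runs the debug action and only emits the skipping result shown above.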
18445 1726882534.37676: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 18445 1726882534.37765: in run() - task 0e448fcc-3ce9-f6eb-935c-00000000004f 18445 1726882534.37786: variable 'ansible_search_path' from source: unknown 18445 1726882534.37793: variable 'ansible_search_path' from source: unknown 18445 1726882534.37827: calling self._execute() 18445 1726882534.37906: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882534.37918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882534.37933: variable 'omit' from source: magic vars 18445 1726882534.38340: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882534.40647: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882534.40716: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882534.40758: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882534.40799: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882534.40827: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882534.40901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882534.40936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882534.40971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882534.41024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882534.41043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882534.41179: variable 'ansible_distribution' from source: facts 18445 1726882534.41191: variable 'ansible_distribution_major_version' from source: facts 18445 1726882534.41213: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882534.41220: when evaluation is False, skipping this task 18445 1726882534.41229: _execute() done 18445 1726882534.41237: dumping result to json 18445 1726882534.41245: done dumping result, returning 18445 1726882534.41255: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-f6eb-935c-00000000004f] 18445 1726882534.41267: sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000004f skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)" } 18445 1726882534.41401: no more pending results, returning what we have 18445 1726882534.41405: results queue empty 18445 1726882534.41406: checking for any_errors_fatal 18445 1726882534.41412: done checking for any_errors_fatal 18445 1726882534.41413: checking for max_fail_percentage 18445 1726882534.41415: done checking for max_fail_percentage 18445 1726882534.41416: checking to see if all hosts have failed and the running result is not ok 18445 1726882534.41417: done checking to see if all hosts have failed 18445 1726882534.41418: getting the remaining hosts for this loop 18445 1726882534.41419: done getting the remaining hosts for this loop 18445 1726882534.41423: getting the next task for host managed_node1 18445 1726882534.41429: done getting next task for host managed_node1 18445 1726882534.41433: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 18445 1726882534.41435: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882534.41447: getting variables 18445 1726882534.41449: in VariableManager get_vars() 18445 1726882534.41489: Calling all_inventory to load vars for managed_node1 18445 1726882534.41492: Calling groups_inventory to load vars for managed_node1 18445 1726882534.41494: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882534.41505: Calling all_plugins_play to load vars for managed_node1 18445 1726882534.41508: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882534.41511: Calling groups_plugins_play to load vars for managed_node1 18445 1726882534.41685: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882534.41885: done with get_vars() 18445 1726882534.41897: done getting variables 18445 1726882534.41956: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 18445 1726882534.42209: done sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000004f 18445 1726882534.42212: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:35:34 -0400 (0:00:00.052) 0:00:06.359 ****** 18445 1726882534.42225: entering _queue_task() for managed_node1/debug 18445 1726882534.42448: worker is 1 (out of 1 available) 18445 1726882534.42459: exiting _queue_task() for managed_node1/debug 18445 1726882534.42472: done queuing things up, now waiting for results queue to drain 18445 1726882534.42473: waiting for pending results... 
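Editor's note: one detail worth calling out in that conditional is the cast. ansible_distribution_major_version is gathered as a string fact, so the comparison only behaves numerically because of the explicit | int filter. A tiny standalone illustration (a hypothetical task, not part of the role):

- name: Illustrate the int cast used in the conditional above (hypothetical)
  ansible.builtin.debug:
    msg: "major version below 9: {{ ansible_distribution_major_version | int < 9 }}"

Without the cast, the expression would compare a string against an integer, which is not a reliable numeric check.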
18445 1726882534.42727: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 18445 1726882534.42825: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000050 18445 1726882534.42844: variable 'ansible_search_path' from source: unknown 18445 1726882534.42852: variable 'ansible_search_path' from source: unknown 18445 1726882534.42893: calling self._execute() 18445 1726882534.42969: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882534.42981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882534.42995: variable 'omit' from source: magic vars 18445 1726882534.43407: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882534.45805: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882534.45879: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882534.45921: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882534.45977: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882534.46007: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882534.46090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882534.46124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882534.46154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882534.46204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882534.46224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882534.46359: variable 'ansible_distribution' from source: facts 18445 1726882534.46373: variable 'ansible_distribution_major_version' from source: facts 18445 1726882534.46399: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882534.46407: when evaluation is False, skipping this task 18445 1726882534.46414: _execute() done 18445 1726882534.46421: dumping result to json 18445 1726882534.46429: done dumping result, returning 18445 1726882534.46440: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-f6eb-935c-000000000050] 18445 1726882534.46451: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000050 skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 
9)" } 18445 1726882534.46591: no more pending results, returning what we have 18445 1726882534.46595: results queue empty 18445 1726882534.46596: checking for any_errors_fatal 18445 1726882534.46602: done checking for any_errors_fatal 18445 1726882534.46602: checking for max_fail_percentage 18445 1726882534.46604: done checking for max_fail_percentage 18445 1726882534.46606: checking to see if all hosts have failed and the running result is not ok 18445 1726882534.46606: done checking to see if all hosts have failed 18445 1726882534.46607: getting the remaining hosts for this loop 18445 1726882534.46609: done getting the remaining hosts for this loop 18445 1726882534.46613: getting the next task for host managed_node1 18445 1726882534.46619: done getting next task for host managed_node1 18445 1726882534.46623: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 18445 1726882534.46626: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882534.46639: getting variables 18445 1726882534.46641: in VariableManager get_vars() 18445 1726882534.46682: Calling all_inventory to load vars for managed_node1 18445 1726882534.46685: Calling groups_inventory to load vars for managed_node1 18445 1726882534.46687: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882534.46698: Calling all_plugins_play to load vars for managed_node1 18445 1726882534.46701: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882534.46704: Calling groups_plugins_play to load vars for managed_node1 18445 1726882534.46931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882534.47155: done with get_vars() 18445 1726882534.47167: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:35:34 -0400 (0:00:00.050) 0:00:06.409 ****** 18445 1726882534.47265: entering _queue_task() for managed_node1/ping 18445 1726882534.47283: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000050 18445 1726882534.47292: WORKER PROCESS EXITING 18445 1726882534.47776: worker is 1 (out of 1 available) 18445 1726882534.47788: exiting _queue_task() for managed_node1/ping 18445 1726882534.47800: done queuing things up, now waiting for results queue to drain 18445 1726882534.47801: waiting for pending results... 
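Editor's note: the "Re-test connectivity" task is queued through _queue_task() for managed_node1/ping, so it is built on the builtin ping module, which verifies that the controller can reach the host over its configured connection and run Python there (it is not an ICMP ping). A minimal sketch, with the when: guard mirroring the false_condition logged above (the rest of the task body is an assumption):

- name: Re-test connectivity
  ansible.builtin.ping:
  when: ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9

As the following entries show, the guard is again False on this host, so the connectivity check is skipped along with the rest of the role's tail-end tasks.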
18445 1726882534.48056: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 18445 1726882534.48149: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000051 18445 1726882534.48171: variable 'ansible_search_path' from source: unknown 18445 1726882534.48178: variable 'ansible_search_path' from source: unknown 18445 1726882534.48217: calling self._execute() 18445 1726882534.48303: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882534.48315: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882534.48329: variable 'omit' from source: magic vars 18445 1726882534.48722: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882534.51075: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882534.51141: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882534.51188: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882534.51226: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882534.51255: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882534.51337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882534.51373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882534.51406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882534.51452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882534.51475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882534.51615: variable 'ansible_distribution' from source: facts 18445 1726882534.51626: variable 'ansible_distribution_major_version' from source: facts 18445 1726882534.51648: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882534.51656: when evaluation is False, skipping this task 18445 1726882534.51667: _execute() done 18445 1726882534.51675: dumping result to json 18445 1726882534.51683: done dumping result, returning 18445 1726882534.51693: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-f6eb-935c-000000000051] 18445 1726882534.51707: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000051 18445 1726882534.51789: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000051 18445 1726882534.51795: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": 
false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882534.51851: no more pending results, returning what we have 18445 1726882534.51854: results queue empty 18445 1726882534.51855: checking for any_errors_fatal 18445 1726882534.51860: done checking for any_errors_fatal 18445 1726882534.51861: checking for max_fail_percentage 18445 1726882534.51865: done checking for max_fail_percentage 18445 1726882534.51866: checking to see if all hosts have failed and the running result is not ok 18445 1726882534.51867: done checking to see if all hosts have failed 18445 1726882534.51868: getting the remaining hosts for this loop 18445 1726882534.51869: done getting the remaining hosts for this loop 18445 1726882534.51873: getting the next task for host managed_node1 18445 1726882534.51880: done getting next task for host managed_node1 18445 1726882534.51882: ^ task is: TASK: meta (role_complete) 18445 1726882534.51884: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882534.51897: getting variables 18445 1726882534.51900: in VariableManager get_vars() 18445 1726882534.51941: Calling all_inventory to load vars for managed_node1 18445 1726882534.51943: Calling groups_inventory to load vars for managed_node1 18445 1726882534.51946: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882534.51956: Calling all_plugins_play to load vars for managed_node1 18445 1726882534.51960: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882534.51964: Calling groups_plugins_play to load vars for managed_node1 18445 1726882534.52163: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882534.52360: done with get_vars() 18445 1726882534.52374: done getting variables 18445 1726882534.52458: done queuing things up, now waiting for results queue to drain 18445 1726882534.52460: results queue empty 18445 1726882534.52461: checking for any_errors_fatal 18445 1726882534.52464: done checking for any_errors_fatal 18445 1726882534.52700: checking for max_fail_percentage 18445 1726882534.52702: done checking for max_fail_percentage 18445 1726882534.52702: checking to see if all hosts have failed and the running result is not ok 18445 1726882534.52703: done checking to see if all hosts have failed 18445 1726882534.52704: getting the remaining hosts for this loop 18445 1726882534.52705: done getting the remaining hosts for this loop 18445 1726882534.52707: getting the next task for host managed_node1 18445 1726882534.52710: done getting next task for host managed_node1 18445 1726882534.52712: ^ task is: TASK: meta (flush_handlers) 18445 1726882534.52713: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18445 1726882534.52716: getting variables 18445 1726882534.52717: in VariableManager get_vars() 18445 1726882534.52729: Calling all_inventory to load vars for managed_node1 18445 1726882534.52731: Calling groups_inventory to load vars for managed_node1 18445 1726882534.52733: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882534.52738: Calling all_plugins_play to load vars for managed_node1 18445 1726882534.52740: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882534.52743: Calling groups_plugins_play to load vars for managed_node1 18445 1726882534.52881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882534.53093: done with get_vars() 18445 1726882534.53102: done getting variables 18445 1726882534.53147: in VariableManager get_vars() 18445 1726882534.53157: Calling all_inventory to load vars for managed_node1 18445 1726882534.53165: Calling groups_inventory to load vars for managed_node1 18445 1726882534.53168: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882534.53172: Calling all_plugins_play to load vars for managed_node1 18445 1726882534.53175: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882534.53178: Calling groups_plugins_play to load vars for managed_node1 18445 1726882534.53306: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882534.54285: done with get_vars() 18445 1726882534.54298: done queuing things up, now waiting for results queue to drain 18445 1726882534.54300: results queue empty 18445 1726882534.54301: checking for any_errors_fatal 18445 1726882534.54302: done checking for any_errors_fatal 18445 1726882534.54303: checking for max_fail_percentage 18445 1726882534.54303: done checking for max_fail_percentage 18445 1726882534.54304: checking to see if all hosts have failed and the running result is not ok 18445 1726882534.54305: done checking to see if all hosts have failed 18445 1726882534.54306: getting the remaining hosts for this loop 18445 1726882534.54306: done getting the remaining hosts for this loop 18445 1726882534.54309: getting the next task for host managed_node1 18445 1726882534.54312: done getting next task for host managed_node1 18445 1726882534.54313: ^ task is: TASK: meta (flush_handlers) 18445 1726882534.54315: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18445 1726882534.54317: getting variables 18445 1726882534.54318: in VariableManager get_vars() 18445 1726882534.54328: Calling all_inventory to load vars for managed_node1 18445 1726882534.54329: Calling groups_inventory to load vars for managed_node1 18445 1726882534.54331: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882534.54336: Calling all_plugins_play to load vars for managed_node1 18445 1726882534.54338: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882534.54341: Calling groups_plugins_play to load vars for managed_node1 18445 1726882534.54475: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882534.54679: done with get_vars() 18445 1726882534.54687: done getting variables 18445 1726882534.54730: in VariableManager get_vars() 18445 1726882534.54740: Calling all_inventory to load vars for managed_node1 18445 1726882534.54742: Calling groups_inventory to load vars for managed_node1 18445 1726882534.54744: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882534.54748: Calling all_plugins_play to load vars for managed_node1 18445 1726882534.54751: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882534.54753: Calling groups_plugins_play to load vars for managed_node1 18445 1726882534.55547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882534.55726: done with get_vars() 18445 1726882534.55738: done queuing things up, now waiting for results queue to drain 18445 1726882534.55739: results queue empty 18445 1726882534.55740: checking for any_errors_fatal 18445 1726882534.55742: done checking for any_errors_fatal 18445 1726882534.55742: checking for max_fail_percentage 18445 1726882534.55743: done checking for max_fail_percentage 18445 1726882534.55744: checking to see if all hosts have failed and the running result is not ok 18445 1726882534.55745: done checking to see if all hosts have failed 18445 1726882534.55746: getting the remaining hosts for this loop 18445 1726882534.55746: done getting the remaining hosts for this loop 18445 1726882534.55749: getting the next task for host managed_node1 18445 1726882534.55752: done getting next task for host managed_node1 18445 1726882534.55752: ^ task is: None 18445 1726882534.55754: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18445 1726882534.55755: done queuing things up, now waiting for results queue to drain 18445 1726882534.55756: results queue empty 18445 1726882534.55756: checking for any_errors_fatal 18445 1726882534.55757: done checking for any_errors_fatal 18445 1726882534.55758: checking for max_fail_percentage 18445 1726882534.55759: done checking for max_fail_percentage 18445 1726882534.55759: checking to see if all hosts have failed and the running result is not ok 18445 1726882534.55760: done checking to see if all hosts have failed 18445 1726882534.55761: getting the next task for host managed_node1 18445 1726882534.55765: done getting next task for host managed_node1 18445 1726882534.55766: ^ task is: None 18445 1726882534.55767: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882534.55801: in VariableManager get_vars() 18445 1726882534.55815: done with get_vars() 18445 1726882534.55821: in VariableManager get_vars() 18445 1726882534.55830: done with get_vars() 18445 1726882534.55834: variable 'omit' from source: magic vars 18445 1726882534.55865: in VariableManager get_vars() 18445 1726882534.55876: done with get_vars() 18445 1726882534.55896: variable 'omit' from source: magic vars PLAY [Delete the interface] **************************************************** 18445 1726882534.56226: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18445 1726882534.56480: getting the remaining hosts for this loop 18445 1726882534.56481: done getting the remaining hosts for this loop 18445 1726882534.56484: getting the next task for host managed_node1 18445 1726882534.56486: done getting next task for host managed_node1 18445 1726882534.56490: ^ task is: TASK: Gathering Facts 18445 1726882534.56492: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18445 1726882534.56494: getting variables 18445 1726882534.56495: in VariableManager get_vars() 18445 1726882534.56502: Calling all_inventory to load vars for managed_node1 18445 1726882534.56504: Calling groups_inventory to load vars for managed_node1 18445 1726882534.56506: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882534.56511: Calling all_plugins_play to load vars for managed_node1 18445 1726882534.56513: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882534.56515: Calling groups_plugins_play to load vars for managed_node1 18445 1726882534.56652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882534.56836: done with get_vars() 18445 1726882534.56843: done getting variables 18445 1726882534.56879: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5 Friday 20 September 2024 21:35:34 -0400 (0:00:00.096) 0:00:06.506 ****** 18445 1726882534.56907: entering _queue_task() for managed_node1/gather_facts 18445 1726882534.57190: worker is 1 (out of 1 available) 18445 1726882534.57201: exiting _queue_task() for managed_node1/gather_facts 18445 1726882534.57211: done queuing things up, now waiting for results queue to drain 18445 1726882534.57213: waiting for pending results... 
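Editor's note: at this point the linear strategy has moved on to the next play, "Delete the interface", and the first thing queued is its "Gathering Facts" task from down_profile+delete_interface.yml:5. That fact-gathering step only runs when gathering is enabled for the play (it is on by default). A hypothetical skeleton of such a play header — the hosts pattern and any other keywords are assumptions, not taken from the test playbook:

- name: Delete the interface
  hosts: all            # assumed host pattern
  gather_facts: true    # enables the "Gathering Facts" task queued above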
18445 1726882534.57459: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18445 1726882534.57696: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000231 18445 1726882534.57714: variable 'ansible_search_path' from source: unknown 18445 1726882534.57743: calling self._execute() 18445 1726882534.57804: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882534.57817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882534.57833: variable 'omit' from source: magic vars 18445 1726882534.58248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882534.61369: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882534.61696: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882534.61736: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882534.61776: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882534.61810: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882534.61895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882534.61935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882534.61968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882534.62019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882534.62039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882534.62178: variable 'ansible_distribution' from source: facts 18445 1726882534.62188: variable 'ansible_distribution_major_version' from source: facts 18445 1726882534.62209: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882534.62216: when evaluation is False, skipping this task 18445 1726882534.62221: _execute() done 18445 1726882534.62232: dumping result to json 18445 1726882534.62239: done dumping result, returning 18445 1726882534.62248: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0e448fcc-3ce9-f6eb-935c-000000000231] 18445 1726882534.62257: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000231 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882534.62487: no more pending results, returning what we have 18445 1726882534.62491: results queue empty 18445 
1726882534.62492: checking for any_errors_fatal 18445 1726882534.62494: done checking for any_errors_fatal 18445 1726882534.62494: checking for max_fail_percentage 18445 1726882534.62496: done checking for max_fail_percentage 18445 1726882534.62497: checking to see if all hosts have failed and the running result is not ok 18445 1726882534.62498: done checking to see if all hosts have failed 18445 1726882534.62499: getting the remaining hosts for this loop 18445 1726882534.62501: done getting the remaining hosts for this loop 18445 1726882534.62504: getting the next task for host managed_node1 18445 1726882534.62510: done getting next task for host managed_node1 18445 1726882534.62512: ^ task is: TASK: meta (flush_handlers) 18445 1726882534.62514: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882534.62517: getting variables 18445 1726882534.62519: in VariableManager get_vars() 18445 1726882534.62546: Calling all_inventory to load vars for managed_node1 18445 1726882534.62549: Calling groups_inventory to load vars for managed_node1 18445 1726882534.62552: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882534.62565: Calling all_plugins_play to load vars for managed_node1 18445 1726882534.62569: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882534.62573: Calling groups_plugins_play to load vars for managed_node1 18445 1726882534.62717: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882534.62911: done with get_vars() 18445 1726882534.62922: done getting variables 18445 1726882534.63229: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000231 18445 1726882534.63232: WORKER PROCESS EXITING 18445 1726882534.63260: in VariableManager get_vars() 18445 1726882534.63270: Calling all_inventory to load vars for managed_node1 18445 1726882534.63272: Calling groups_inventory to load vars for managed_node1 18445 1726882534.63274: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882534.63278: Calling all_plugins_play to load vars for managed_node1 18445 1726882534.63280: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882534.63283: Calling groups_plugins_play to load vars for managed_node1 18445 1726882534.63412: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882534.63814: done with get_vars() 18445 1726882534.63827: done queuing things up, now waiting for results queue to drain 18445 1726882534.63829: results queue empty 18445 1726882534.63830: checking for any_errors_fatal 18445 1726882534.63832: done checking for any_errors_fatal 18445 1726882534.63832: checking for max_fail_percentage 18445 1726882534.63834: done checking for max_fail_percentage 18445 1726882534.63834: checking to see if all hosts have failed and the running result is not ok 18445 1726882534.63835: done checking to see if all hosts have failed 18445 1726882534.63836: getting the remaining hosts for this loop 18445 1726882534.63837: done getting the remaining hosts for this loop 18445 1726882534.63839: getting the next task for host managed_node1 18445 1726882534.63843: done getting next task for host managed_node1 18445 
1726882534.63846: ^ task is: TASK: Include the task 'delete_interface.yml' 18445 1726882534.63847: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882534.63849: getting variables 18445 1726882534.63850: in VariableManager get_vars() 18445 1726882534.63858: Calling all_inventory to load vars for managed_node1 18445 1726882534.63860: Calling groups_inventory to load vars for managed_node1 18445 1726882534.63862: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882534.63874: Calling all_plugins_play to load vars for managed_node1 18445 1726882534.63877: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882534.63880: Calling groups_plugins_play to load vars for managed_node1 18445 1726882534.64014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882534.64899: done with get_vars() 18445 1726882534.64907: done getting variables TASK [Include the task 'delete_interface.yml'] ********************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:8 Friday 20 September 2024 21:35:34 -0400 (0:00:00.080) 0:00:06.586 ****** 18445 1726882534.64981: entering _queue_task() for managed_node1/include_tasks 18445 1726882534.65243: worker is 1 (out of 1 available) 18445 1726882534.65254: exiting _queue_task() for managed_node1/include_tasks 18445 1726882534.65468: done queuing things up, now waiting for results queue to drain 18445 1726882534.65470: waiting for pending results... 
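Editor's note: "Include the task 'delete_interface.yml'" is queued through _queue_task() for managed_node1/include_tasks, i.e. it is a dynamic include. A hedged sketch of that shape — only the task name and the include_tasks action are taken from the log; the file path and the when: guard placement are assumptions:

- name: Include the task 'delete_interface.yml'
  ansible.builtin.include_tasks: tasks/delete_interface.yml   # path assumed for illustration
  when: ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9

A dynamic include that is skipped never loads its file, so none of the tasks inside delete_interface.yml are added for this host; that matches the entries below, where the play proceeds straight to its flush_handlers meta tasks.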
18445 1726882534.66157: running TaskExecutor() for managed_node1/TASK: Include the task 'delete_interface.yml' 18445 1726882534.66347: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000054 18445 1726882534.66364: variable 'ansible_search_path' from source: unknown 18445 1726882534.66398: calling self._execute() 18445 1726882534.66767: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882534.66773: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882534.66783: variable 'omit' from source: magic vars 18445 1726882534.67677: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882534.71219: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882534.71296: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882534.71345: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882534.71401: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882534.71441: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882534.71531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882534.71585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882534.71618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882534.71675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882534.71695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882534.71845: variable 'ansible_distribution' from source: facts 18445 1726882534.71866: variable 'ansible_distribution_major_version' from source: facts 18445 1726882534.71894: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882534.71903: when evaluation is False, skipping this task 18445 1726882534.71909: _execute() done 18445 1726882534.71916: dumping result to json 18445 1726882534.71923: done dumping result, returning 18445 1726882534.71934: done running TaskExecutor() for managed_node1/TASK: Include the task 'delete_interface.yml' [0e448fcc-3ce9-f6eb-935c-000000000054] 18445 1726882534.71944: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000054 18445 1726882534.72068: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000054 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional 
result was False" } 18445 1726882534.72121: no more pending results, returning what we have 18445 1726882534.72125: results queue empty 18445 1726882534.72126: checking for any_errors_fatal 18445 1726882534.72128: done checking for any_errors_fatal 18445 1726882534.72129: checking for max_fail_percentage 18445 1726882534.72130: done checking for max_fail_percentage 18445 1726882534.72131: checking to see if all hosts have failed and the running result is not ok 18445 1726882534.72132: done checking to see if all hosts have failed 18445 1726882534.72133: getting the remaining hosts for this loop 18445 1726882534.72134: done getting the remaining hosts for this loop 18445 1726882534.72138: getting the next task for host managed_node1 18445 1726882534.72144: done getting next task for host managed_node1 18445 1726882534.72146: ^ task is: TASK: meta (flush_handlers) 18445 1726882534.72148: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882534.72151: getting variables 18445 1726882534.72153: in VariableManager get_vars() 18445 1726882534.72181: Calling all_inventory to load vars for managed_node1 18445 1726882534.72183: Calling groups_inventory to load vars for managed_node1 18445 1726882534.72187: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882534.72194: WORKER PROCESS EXITING 18445 1726882534.72205: Calling all_plugins_play to load vars for managed_node1 18445 1726882534.72208: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882534.72211: Calling groups_plugins_play to load vars for managed_node1 18445 1726882534.72375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882534.72610: done with get_vars() 18445 1726882534.72619: done getting variables 18445 1726882534.72686: in VariableManager get_vars() 18445 1726882534.72694: Calling all_inventory to load vars for managed_node1 18445 1726882534.72696: Calling groups_inventory to load vars for managed_node1 18445 1726882534.72699: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882534.72703: Calling all_plugins_play to load vars for managed_node1 18445 1726882534.72705: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882534.72708: Calling groups_plugins_play to load vars for managed_node1 18445 1726882534.72834: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882534.73090: done with get_vars() 18445 1726882534.73102: done queuing things up, now waiting for results queue to drain 18445 1726882534.73104: results queue empty 18445 1726882534.73105: checking for any_errors_fatal 18445 1726882534.73107: done checking for any_errors_fatal 18445 1726882534.73108: checking for max_fail_percentage 18445 1726882534.73109: done checking for max_fail_percentage 18445 1726882534.73109: checking to see if all hosts have failed and the running result is not ok 18445 1726882534.73110: done checking to see if all hosts have failed 18445 1726882534.73111: getting the remaining hosts for this loop 18445 1726882534.73112: done getting the remaining hosts for this loop 18445 1726882534.73114: getting the next task for host managed_node1 18445 1726882534.73117: done getting 
next task for host managed_node1 18445 1726882534.73118: ^ task is: TASK: meta (flush_handlers) 18445 1726882534.73120: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882534.73122: getting variables 18445 1726882534.73123: in VariableManager get_vars() 18445 1726882534.73131: Calling all_inventory to load vars for managed_node1 18445 1726882534.73133: Calling groups_inventory to load vars for managed_node1 18445 1726882534.73135: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882534.73144: Calling all_plugins_play to load vars for managed_node1 18445 1726882534.73146: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882534.73149: Calling groups_plugins_play to load vars for managed_node1 18445 1726882534.73262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882534.73419: done with get_vars() 18445 1726882534.73428: done getting variables 18445 1726882534.73479: in VariableManager get_vars() 18445 1726882534.73487: Calling all_inventory to load vars for managed_node1 18445 1726882534.73489: Calling groups_inventory to load vars for managed_node1 18445 1726882534.73491: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882534.73495: Calling all_plugins_play to load vars for managed_node1 18445 1726882534.73497: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882534.73500: Calling groups_plugins_play to load vars for managed_node1 18445 1726882534.73859: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882534.74042: done with get_vars() 18445 1726882534.74053: done queuing things up, now waiting for results queue to drain 18445 1726882534.74055: results queue empty 18445 1726882534.74056: checking for any_errors_fatal 18445 1726882534.74057: done checking for any_errors_fatal 18445 1726882534.74058: checking for max_fail_percentage 18445 1726882534.74058: done checking for max_fail_percentage 18445 1726882534.74059: checking to see if all hosts have failed and the running result is not ok 18445 1726882534.74060: done checking to see if all hosts have failed 18445 1726882534.74061: getting the remaining hosts for this loop 18445 1726882534.74062: done getting the remaining hosts for this loop 18445 1726882534.74065: getting the next task for host managed_node1 18445 1726882534.74169: done getting next task for host managed_node1 18445 1726882534.74171: ^ task is: None 18445 1726882534.74172: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18445 1726882534.74173: done queuing things up, now waiting for results queue to drain 18445 1726882534.74174: results queue empty 18445 1726882534.74175: checking for any_errors_fatal 18445 1726882534.74176: done checking for any_errors_fatal 18445 1726882534.74176: checking for max_fail_percentage 18445 1726882534.74177: done checking for max_fail_percentage 18445 1726882534.74178: checking to see if all hosts have failed and the running result is not ok 18445 1726882534.74179: done checking to see if all hosts have failed 18445 1726882534.74180: getting the next task for host managed_node1 18445 1726882534.74182: done getting next task for host managed_node1 18445 1726882534.74183: ^ task is: None 18445 1726882534.74184: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882534.74276: in VariableManager get_vars() 18445 1726882534.74297: done with get_vars() 18445 1726882534.74303: in VariableManager get_vars() 18445 1726882534.74316: done with get_vars() 18445 1726882534.74320: variable 'omit' from source: magic vars 18445 1726882534.74433: variable 'profile' from source: play vars 18445 1726882534.74630: in VariableManager get_vars() 18445 1726882534.74644: done with get_vars() 18445 1726882534.74669: variable 'omit' from source: magic vars 18445 1726882534.74741: variable 'profile' from source: play vars PLAY [Remove {{ profile }}] **************************************************** 18445 1726882534.75471: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18445 1726882534.75496: getting the remaining hosts for this loop 18445 1726882534.75497: done getting the remaining hosts for this loop 18445 1726882534.75500: getting the next task for host managed_node1 18445 1726882534.75502: done getting next task for host managed_node1 18445 1726882534.75504: ^ task is: TASK: Gathering Facts 18445 1726882534.75506: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18445 1726882534.75508: getting variables 18445 1726882534.75509: in VariableManager get_vars() 18445 1726882534.75519: Calling all_inventory to load vars for managed_node1 18445 1726882534.75521: Calling groups_inventory to load vars for managed_node1 18445 1726882534.75523: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882534.75527: Calling all_plugins_play to load vars for managed_node1 18445 1726882534.75530: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882534.75533: Calling groups_plugins_play to load vars for managed_node1 18445 1726882534.75701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882534.75882: done with get_vars() 18445 1726882534.75890: done getting variables 18445 1726882534.75927: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 Friday 20 September 2024 21:35:34 -0400 (0:00:00.109) 0:00:06.696 ****** 18445 1726882534.75950: entering _queue_task() for managed_node1/gather_facts 18445 1726882534.76239: worker is 1 (out of 1 available) 18445 1726882534.76251: exiting _queue_task() for managed_node1/gather_facts 18445 1726882534.76267: done queuing things up, now waiting for results queue to drain 18445 1726882534.76269: waiting for pending results... 
18445 1726882534.76536: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18445 1726882534.76635: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000246 18445 1726882534.76652: variable 'ansible_search_path' from source: unknown 18445 1726882534.76693: calling self._execute() 18445 1726882534.76779: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882534.76789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882534.76800: variable 'omit' from source: magic vars 18445 1726882534.77223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882534.79744: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882534.79817: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882534.79857: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882534.79903: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882534.79934: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882534.80033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882534.80069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882534.80106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882534.80174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882534.80194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882534.80519: variable 'ansible_distribution' from source: facts 18445 1726882534.80532: variable 'ansible_distribution_major_version' from source: facts 18445 1726882534.80560: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882534.80571: when evaluation is False, skipping this task 18445 1726882534.80578: _execute() done 18445 1726882534.80584: dumping result to json 18445 1726882534.80590: done dumping result, returning 18445 1726882534.80599: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0e448fcc-3ce9-f6eb-935c-000000000246] 18445 1726882534.80608: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000246 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882534.80740: no more pending results, returning what we have 18445 1726882534.80744: results queue empty 18445 
1726882534.80745: checking for any_errors_fatal 18445 1726882534.80746: done checking for any_errors_fatal 18445 1726882534.80746: checking for max_fail_percentage 18445 1726882534.80748: done checking for max_fail_percentage 18445 1726882534.80749: checking to see if all hosts have failed and the running result is not ok 18445 1726882534.80750: done checking to see if all hosts have failed 18445 1726882534.80750: getting the remaining hosts for this loop 18445 1726882534.80752: done getting the remaining hosts for this loop 18445 1726882534.80757: getting the next task for host managed_node1 18445 1726882534.80762: done getting next task for host managed_node1 18445 1726882534.80769: ^ task is: TASK: meta (flush_handlers) 18445 1726882534.80771: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882534.80774: getting variables 18445 1726882534.80776: in VariableManager get_vars() 18445 1726882534.80810: Calling all_inventory to load vars for managed_node1 18445 1726882534.80812: Calling groups_inventory to load vars for managed_node1 18445 1726882534.80814: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882534.80822: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000246 18445 1726882534.80826: WORKER PROCESS EXITING 18445 1726882534.80837: Calling all_plugins_play to load vars for managed_node1 18445 1726882534.80840: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882534.80842: Calling groups_plugins_play to load vars for managed_node1 18445 1726882534.81018: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882534.81237: done with get_vars() 18445 1726882534.81247: done getting variables 18445 1726882534.81322: in VariableManager get_vars() 18445 1726882534.81333: Calling all_inventory to load vars for managed_node1 18445 1726882534.81335: Calling groups_inventory to load vars for managed_node1 18445 1726882534.81337: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882534.81341: Calling all_plugins_play to load vars for managed_node1 18445 1726882534.81343: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882534.81346: Calling groups_plugins_play to load vars for managed_node1 18445 1726882534.81674: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882534.82051: done with get_vars() 18445 1726882534.82066: done queuing things up, now waiting for results queue to drain 18445 1726882534.82068: results queue empty 18445 1726882534.82069: checking for any_errors_fatal 18445 1726882534.82071: done checking for any_errors_fatal 18445 1726882534.82072: checking for max_fail_percentage 18445 1726882534.82073: done checking for max_fail_percentage 18445 1726882534.82074: checking to see if all hosts have failed and the running result is not ok 18445 1726882534.82074: done checking to see if all hosts have failed 18445 1726882534.82075: getting the remaining hosts for this loop 18445 1726882534.82076: done getting the remaining hosts for this loop 18445 1726882534.82078: getting the next task for host managed_node1 18445 1726882534.82082: done getting next task for host managed_node1 18445 
1726882534.82085: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 18445 1726882534.82087: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882534.82096: getting variables 18445 1726882534.82097: in VariableManager get_vars() 18445 1726882534.82112: Calling all_inventory to load vars for managed_node1 18445 1726882534.82119: Calling groups_inventory to load vars for managed_node1 18445 1726882534.82121: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882534.82132: Calling all_plugins_play to load vars for managed_node1 18445 1726882534.82134: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882534.82137: Calling groups_plugins_play to load vars for managed_node1 18445 1726882534.82295: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882534.82498: done with get_vars() 18445 1726882534.82506: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:35:34 -0400 (0:00:00.066) 0:00:06.762 ****** 18445 1726882534.82583: entering _queue_task() for managed_node1/include_tasks 18445 1726882534.82843: worker is 1 (out of 1 available) 18445 1726882534.82855: exiting _queue_task() for managed_node1/include_tasks 18445 1726882534.82867: done queuing things up, now waiting for results queue to drain 18445 1726882534.82868: waiting for pending results... 
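Every task in this excerpt is skipped the same way: the trace prints `Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False` and the task result carries `"skip_reason": "Conditional result was False"`. A minimal way to check that guard expression by hand, assuming only the expression printed in the log and the `managed_node1` host from the inventory (this ad-hoc play is not part of the test suite), would be:

```yaml
# Hypothetical ad-hoc play, not part of the test suite: evaluates the same
# guard expression that the log shows resolving to False on managed_node1.
- hosts: managed_node1
  gather_facts: true
  tasks:
    - name: Show how the distribution guard evaluates
      ansible.builtin.debug:
        msg: >-
          {{ ansible_distribution in ['CentOS', 'RedHat'] and
             ansible_distribution_major_version | int < 9 }}
```

On a host that is not CentOS/RHEL, or whose EL major version is 9 or later, the expression comes out `False`, which matches the skip results recorded in this run.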
18445 1726882534.83137: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 18445 1726882534.83240: in run() - task 0e448fcc-3ce9-f6eb-935c-00000000005c 18445 1726882534.83260: variable 'ansible_search_path' from source: unknown 18445 1726882534.83270: variable 'ansible_search_path' from source: unknown 18445 1726882534.83313: calling self._execute() 18445 1726882534.83396: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882534.83406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882534.83427: variable 'omit' from source: magic vars 18445 1726882534.83843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882534.87162: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882534.87343: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882534.87431: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882534.87624: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882534.87655: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882534.87838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882534.87950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882534.87984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882534.88108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882534.88296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882534.88566: variable 'ansible_distribution' from source: facts 18445 1726882534.88584: variable 'ansible_distribution_major_version' from source: facts 18445 1726882534.88619: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882534.88627: when evaluation is False, skipping this task 18445 1726882534.88633: _execute() done 18445 1726882534.88640: dumping result to json 18445 1726882534.88646: done dumping result, returning 18445 1726882534.88658: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-f6eb-935c-00000000005c] 18445 1726882534.88675: sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000005c 18445 1726882534.88819: done sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000005c skipping: [managed_node1] => { "changed": false, 
"false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882534.88874: no more pending results, returning what we have 18445 1726882534.88879: results queue empty 18445 1726882534.88879: checking for any_errors_fatal 18445 1726882534.88881: done checking for any_errors_fatal 18445 1726882534.88882: checking for max_fail_percentage 18445 1726882534.88887: done checking for max_fail_percentage 18445 1726882534.88888: checking to see if all hosts have failed and the running result is not ok 18445 1726882534.88889: done checking to see if all hosts have failed 18445 1726882534.88890: getting the remaining hosts for this loop 18445 1726882534.88892: done getting the remaining hosts for this loop 18445 1726882534.88897: getting the next task for host managed_node1 18445 1726882534.88904: done getting next task for host managed_node1 18445 1726882534.88908: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 18445 1726882534.88911: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882534.88926: getting variables 18445 1726882534.88928: in VariableManager get_vars() 18445 1726882534.88973: Calling all_inventory to load vars for managed_node1 18445 1726882534.88976: Calling groups_inventory to load vars for managed_node1 18445 1726882534.88979: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882534.88989: Calling all_plugins_play to load vars for managed_node1 18445 1726882534.88991: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882534.88993: Calling groups_plugins_play to load vars for managed_node1 18445 1726882534.89187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882534.89419: done with get_vars() 18445 1726882534.89431: done getting variables 18445 1726882534.89612: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 18445 1726882534.89636: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:35:34 -0400 (0:00:00.070) 0:00:06.833 ****** 18445 1726882534.89652: entering _queue_task() for managed_node1/debug 18445 1726882534.90091: worker is 1 (out of 1 available) 18445 1726882534.90103: exiting _queue_task() for managed_node1/debug 18445 1726882534.90113: done queuing things up, now waiting for results queue to drain 18445 1726882534.90115: waiting for pending results... 
18445 1726882534.90389: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 18445 1726882534.90492: in run() - task 0e448fcc-3ce9-f6eb-935c-00000000005d 18445 1726882534.90517: variable 'ansible_search_path' from source: unknown 18445 1726882534.90524: variable 'ansible_search_path' from source: unknown 18445 1726882534.90573: calling self._execute() 18445 1726882534.90671: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882534.90682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882534.90699: variable 'omit' from source: magic vars 18445 1726882534.91257: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882534.93602: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882534.93688: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882534.93735: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882534.93776: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882534.93810: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882534.93901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882534.93946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882534.93981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882534.94033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882534.94162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882534.94446: variable 'ansible_distribution' from source: facts 18445 1726882534.94457: variable 'ansible_distribution_major_version' from source: facts 18445 1726882534.94483: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882534.94491: when evaluation is False, skipping this task 18445 1726882534.94497: _execute() done 18445 1726882534.94504: dumping result to json 18445 1726882534.94510: done dumping result, returning 18445 1726882534.94520: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-f6eb-935c-00000000005d] 18445 1726882534.94530: sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000005d 18445 1726882534.94654: done sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000005d skipping: [managed_node1] => { "false_condition": "(ansible_distribution in 
['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 18445 1726882534.94705: no more pending results, returning what we have 18445 1726882534.94709: results queue empty 18445 1726882534.94709: checking for any_errors_fatal 18445 1726882534.94715: done checking for any_errors_fatal 18445 1726882534.94716: checking for max_fail_percentage 18445 1726882534.94718: done checking for max_fail_percentage 18445 1726882534.94719: checking to see if all hosts have failed and the running result is not ok 18445 1726882534.94720: done checking to see if all hosts have failed 18445 1726882534.94720: getting the remaining hosts for this loop 18445 1726882534.94722: done getting the remaining hosts for this loop 18445 1726882534.94726: getting the next task for host managed_node1 18445 1726882534.94732: done getting next task for host managed_node1 18445 1726882534.94736: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 18445 1726882534.94738: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882534.94750: getting variables 18445 1726882534.94752: in VariableManager get_vars() 18445 1726882534.94793: Calling all_inventory to load vars for managed_node1 18445 1726882534.94796: Calling groups_inventory to load vars for managed_node1 18445 1726882534.94798: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882534.94808: Calling all_plugins_play to load vars for managed_node1 18445 1726882534.94811: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882534.94814: Calling groups_plugins_play to load vars for managed_node1 18445 1726882534.95067: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882534.95280: done with get_vars() 18445 1726882534.95291: done getting variables 18445 1726882534.95393: WORKER PROCESS EXITING 18445 1726882534.95438: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:35:34 -0400 (0:00:00.058) 0:00:06.891 ****** 18445 1726882534.95475: entering _queue_task() for managed_node1/fail 18445 1726882534.96089: worker is 1 (out of 1 available) 18445 1726882534.96105: exiting _queue_task() for managed_node1/fail 18445 1726882534.96118: done queuing things up, now waiting for results queue to drain 18445 1726882534.96136: waiting for pending results... 
18445 1726882534.96435: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 18445 1726882534.96551: in run() - task 0e448fcc-3ce9-f6eb-935c-00000000005e 18445 1726882534.96577: variable 'ansible_search_path' from source: unknown 18445 1726882534.96586: variable 'ansible_search_path' from source: unknown 18445 1726882534.96631: calling self._execute() 18445 1726882534.96718: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882534.96732: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882534.96748: variable 'omit' from source: magic vars 18445 1726882534.97187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882535.00898: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882535.00969: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882535.01018: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882535.01070: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882535.01109: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882535.01196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882535.01236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882535.01270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882535.01324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882535.01344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882535.01490: variable 'ansible_distribution' from source: facts 18445 1726882535.01500: variable 'ansible_distribution_major_version' from source: facts 18445 1726882535.01526: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882535.01536: when evaluation is False, skipping this task 18445 1726882535.01542: _execute() done 18445 1726882535.01548: dumping result to json 18445 1726882535.01554: done dumping result, returning 18445 1726882535.01566: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-f6eb-935c-00000000005e] 18445 1726882535.01576: sending task result for task 
0e448fcc-3ce9-f6eb-935c-00000000005e skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882535.01716: no more pending results, returning what we have 18445 1726882535.01720: results queue empty 18445 1726882535.01721: checking for any_errors_fatal 18445 1726882535.01728: done checking for any_errors_fatal 18445 1726882535.01729: checking for max_fail_percentage 18445 1726882535.01731: done checking for max_fail_percentage 18445 1726882535.01732: checking to see if all hosts have failed and the running result is not ok 18445 1726882535.01732: done checking to see if all hosts have failed 18445 1726882535.01733: getting the remaining hosts for this loop 18445 1726882535.01735: done getting the remaining hosts for this loop 18445 1726882535.01739: getting the next task for host managed_node1 18445 1726882535.01744: done getting next task for host managed_node1 18445 1726882535.01749: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 18445 1726882535.01752: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882535.01770: getting variables 18445 1726882535.01773: in VariableManager get_vars() 18445 1726882535.01811: Calling all_inventory to load vars for managed_node1 18445 1726882535.01813: Calling groups_inventory to load vars for managed_node1 18445 1726882535.01816: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882535.01826: Calling all_plugins_play to load vars for managed_node1 18445 1726882535.01829: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882535.01831: Calling groups_plugins_play to load vars for managed_node1 18445 1726882535.02011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882535.02229: done with get_vars() 18445 1726882535.02241: done getting variables 18445 1726882535.02310: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:35:35 -0400 (0:00:00.068) 0:00:06.960 ****** 18445 1726882535.02346: entering _queue_task() for managed_node1/fail 18445 1726882535.02367: done sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000005e 18445 1726882535.02376: WORKER PROCESS EXITING 18445 1726882535.02904: worker is 1 (out of 1 available) 18445 1726882535.02914: exiting _queue_task() for managed_node1/fail 18445 1726882535.02925: done queuing things up, now waiting for results queue to drain 18445 1726882535.02926: waiting for pending results... 
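The abort task that completed above (roles/network/tasks/main.yml:11) is implemented with the `fail` action, per the `Loading ActionModule 'fail'` entry. Reconstructing only from the task title, a guarded fail task of that kind might look like the sketch below; the variable names and message text are assumptions, and in this run the task never reaches its own condition because the distribution guard shown in the log already evaluates to False:

```yaml
# Hypothetical reconstruction from the task title alone; variable names and
# message text are assumptions, not taken from the actual role source.
- name: >-
    Abort applying the network state configuration if using the
    `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying `network_state` is not supported with the initscripts provider.
  when:
    - network_state is defined
    - network_provider == "initscripts"
```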
18445 1726882535.03190: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 18445 1726882535.03287: in run() - task 0e448fcc-3ce9-f6eb-935c-00000000005f 18445 1726882535.03305: variable 'ansible_search_path' from source: unknown 18445 1726882535.03312: variable 'ansible_search_path' from source: unknown 18445 1726882535.03351: calling self._execute() 18445 1726882535.03436: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882535.03447: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882535.03461: variable 'omit' from source: magic vars 18445 1726882535.03955: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882535.06384: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882535.06472: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882535.06512: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882535.06560: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882535.06595: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882535.06682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882535.06717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882535.06757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882535.06806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882535.06827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882535.06980: variable 'ansible_distribution' from source: facts 18445 1726882535.06991: variable 'ansible_distribution_major_version' from source: facts 18445 1726882535.07013: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882535.07020: when evaluation is False, skipping this task 18445 1726882535.07026: _execute() done 18445 1726882535.07032: dumping result to json 18445 1726882535.07041: done dumping result, returning 18445 1726882535.07052: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-f6eb-935c-00000000005f] 18445 1726882535.07068: sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000005f skipping: [managed_node1] 
=> { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882535.07210: no more pending results, returning what we have 18445 1726882535.07214: results queue empty 18445 1726882535.07215: checking for any_errors_fatal 18445 1726882535.07222: done checking for any_errors_fatal 18445 1726882535.07223: checking for max_fail_percentage 18445 1726882535.07225: done checking for max_fail_percentage 18445 1726882535.07227: checking to see if all hosts have failed and the running result is not ok 18445 1726882535.07228: done checking to see if all hosts have failed 18445 1726882535.07228: getting the remaining hosts for this loop 18445 1726882535.07230: done getting the remaining hosts for this loop 18445 1726882535.07234: getting the next task for host managed_node1 18445 1726882535.07240: done getting next task for host managed_node1 18445 1726882535.07245: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 18445 1726882535.07247: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882535.07259: getting variables 18445 1726882535.07261: in VariableManager get_vars() 18445 1726882535.07362: Calling all_inventory to load vars for managed_node1 18445 1726882535.07367: Calling groups_inventory to load vars for managed_node1 18445 1726882535.07370: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882535.07384: Calling all_plugins_play to load vars for managed_node1 18445 1726882535.07387: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882535.07391: Calling groups_plugins_play to load vars for managed_node1 18445 1726882535.07557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882535.07770: done with get_vars() 18445 1726882535.07780: done getting variables 18445 1726882535.07871: done sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000005f 18445 1726882535.07874: WORKER PROCESS EXITING 18445 1726882535.07909: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:35:35 -0400 (0:00:00.057) 0:00:07.017 ****** 18445 1726882535.08055: entering _queue_task() for managed_node1/fail 18445 1726882535.08413: worker is 1 (out of 1 available) 18445 1726882535.08425: exiting _queue_task() for managed_node1/fail 18445 1726882535.08435: done queuing things up, now waiting for results queue to drain 18445 1726882535.08437: waiting for pending results... 
18445 1726882535.08700: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 18445 1726882535.08798: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000060 18445 1726882535.08815: variable 'ansible_search_path' from source: unknown 18445 1726882535.08822: variable 'ansible_search_path' from source: unknown 18445 1726882535.08859: calling self._execute() 18445 1726882535.08949: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882535.08962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882535.08980: variable 'omit' from source: magic vars 18445 1726882535.09404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882535.11842: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882535.11913: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882535.11958: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882535.12009: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882535.12049: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882535.12129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882535.12173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882535.12205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882535.12260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882535.12283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882535.12423: variable 'ansible_distribution' from source: facts 18445 1726882535.12436: variable 'ansible_distribution_major_version' from source: facts 18445 1726882535.12460: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882535.12475: when evaluation is False, skipping this task 18445 1726882535.12483: _execute() done 18445 1726882535.12490: dumping result to json 18445 1726882535.12499: done dumping result, returning 18445 1726882535.12510: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-f6eb-935c-000000000060] 18445 1726882535.12520: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000060 skipping: [managed_node1] => { 
"changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882535.12659: no more pending results, returning what we have 18445 1726882535.12665: results queue empty 18445 1726882535.12666: checking for any_errors_fatal 18445 1726882535.12671: done checking for any_errors_fatal 18445 1726882535.12672: checking for max_fail_percentage 18445 1726882535.12674: done checking for max_fail_percentage 18445 1726882535.12675: checking to see if all hosts have failed and the running result is not ok 18445 1726882535.12676: done checking to see if all hosts have failed 18445 1726882535.12677: getting the remaining hosts for this loop 18445 1726882535.12678: done getting the remaining hosts for this loop 18445 1726882535.12682: getting the next task for host managed_node1 18445 1726882535.12688: done getting next task for host managed_node1 18445 1726882535.12692: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 18445 1726882535.12694: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882535.12705: getting variables 18445 1726882535.12707: in VariableManager get_vars() 18445 1726882535.12745: Calling all_inventory to load vars for managed_node1 18445 1726882535.12748: Calling groups_inventory to load vars for managed_node1 18445 1726882535.12750: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882535.12760: Calling all_plugins_play to load vars for managed_node1 18445 1726882535.12769: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882535.12773: Calling groups_plugins_play to load vars for managed_node1 18445 1726882535.12949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882535.13161: done with get_vars() 18445 1726882535.13175: done getting variables 18445 1726882535.13237: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:35:35 -0400 (0:00:00.052) 0:00:07.069 ****** 18445 1726882535.13271: entering _queue_task() for managed_node1/dnf 18445 1726882535.13289: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000060 18445 1726882535.13299: WORKER PROCESS EXITING 18445 1726882535.13778: worker is 1 (out of 1 available) 18445 1726882535.13789: exiting _queue_task() for managed_node1/dnf 18445 1726882535.13800: done queuing things up, now waiting for results queue to drain 18445 1726882535.13802: waiting for pending results... 
18445 1726882535.14055: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 18445 1726882535.14151: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000061 18445 1726882535.14171: variable 'ansible_search_path' from source: unknown 18445 1726882535.14183: variable 'ansible_search_path' from source: unknown 18445 1726882535.14220: calling self._execute() 18445 1726882535.14373: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882535.14385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882535.14405: variable 'omit' from source: magic vars 18445 1726882535.14820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882535.17190: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882535.17268: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882535.17310: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882535.17350: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882535.17384: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882535.17467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882535.17502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882535.17531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882535.17579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882535.17597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882535.17737: variable 'ansible_distribution' from source: facts 18445 1726882535.17748: variable 'ansible_distribution_major_version' from source: facts 18445 1726882535.17774: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882535.17781: when evaluation is False, skipping this task 18445 1726882535.17788: _execute() done 18445 1726882535.17794: dumping result to json 18445 1726882535.17800: done dumping result, returning 18445 1726882535.17809: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-f6eb-935c-000000000061] 18445 1726882535.17820: sending task result for task 
0e448fcc-3ce9-f6eb-935c-000000000061 18445 1726882535.17919: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000061 18445 1726882535.17925: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882535.18033: no more pending results, returning what we have 18445 1726882535.18036: results queue empty 18445 1726882535.18037: checking for any_errors_fatal 18445 1726882535.18041: done checking for any_errors_fatal 18445 1726882535.18042: checking for max_fail_percentage 18445 1726882535.18044: done checking for max_fail_percentage 18445 1726882535.18045: checking to see if all hosts have failed and the running result is not ok 18445 1726882535.18045: done checking to see if all hosts have failed 18445 1726882535.18046: getting the remaining hosts for this loop 18445 1726882535.18048: done getting the remaining hosts for this loop 18445 1726882535.18051: getting the next task for host managed_node1 18445 1726882535.18056: done getting next task for host managed_node1 18445 1726882535.18060: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 18445 1726882535.18062: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882535.18074: getting variables 18445 1726882535.18076: in VariableManager get_vars() 18445 1726882535.18105: Calling all_inventory to load vars for managed_node1 18445 1726882535.18107: Calling groups_inventory to load vars for managed_node1 18445 1726882535.18110: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882535.18119: Calling all_plugins_play to load vars for managed_node1 18445 1726882535.18122: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882535.18124: Calling groups_plugins_play to load vars for managed_node1 18445 1726882535.18282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882535.18481: done with get_vars() 18445 1726882535.18495: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 18445 1726882535.18570: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:35:35 -0400 (0:00:00.054) 0:00:07.123 ****** 18445 1726882535.18679: entering _queue_task() for managed_node1/yum 18445 1726882535.19035: worker is 1 (out of 1 available) 18445 1726882535.19046: exiting _queue_task() for managed_node1/yum 18445 1726882535.19056: done queuing things up, now 
waiting for results queue to drain 18445 1726882535.19058: waiting for pending results... 18445 1726882535.19299: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 18445 1726882535.19395: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000062 18445 1726882535.19412: variable 'ansible_search_path' from source: unknown 18445 1726882535.19419: variable 'ansible_search_path' from source: unknown 18445 1726882535.19453: calling self._execute() 18445 1726882535.19540: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882535.19550: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882535.19562: variable 'omit' from source: magic vars 18445 1726882535.19977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882535.22082: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882535.22125: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882535.22160: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882535.22187: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882535.22207: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882535.22259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882535.22288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882535.22306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882535.22333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882535.22344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882535.22432: variable 'ansible_distribution' from source: facts 18445 1726882535.22438: variable 'ansible_distribution_major_version' from source: facts 18445 1726882535.22451: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882535.22457: when evaluation is False, skipping this task 18445 1726882535.22459: _execute() done 18445 1726882535.22462: dumping result to json 18445 1726882535.22466: done dumping result, returning 18445 1726882535.22469: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 
[0e448fcc-3ce9-f6eb-935c-000000000062] 18445 1726882535.22474: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000062 18445 1726882535.22557: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000062 18445 1726882535.22561: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882535.22606: no more pending results, returning what we have 18445 1726882535.22609: results queue empty 18445 1726882535.22609: checking for any_errors_fatal 18445 1726882535.22614: done checking for any_errors_fatal 18445 1726882535.22614: checking for max_fail_percentage 18445 1726882535.22616: done checking for max_fail_percentage 18445 1726882535.22616: checking to see if all hosts have failed and the running result is not ok 18445 1726882535.22617: done checking to see if all hosts have failed 18445 1726882535.22618: getting the remaining hosts for this loop 18445 1726882535.22619: done getting the remaining hosts for this loop 18445 1726882535.22622: getting the next task for host managed_node1 18445 1726882535.22626: done getting next task for host managed_node1 18445 1726882535.22630: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 18445 1726882535.22631: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882535.22642: getting variables 18445 1726882535.22643: in VariableManager get_vars() 18445 1726882535.22675: Calling all_inventory to load vars for managed_node1 18445 1726882535.22677: Calling groups_inventory to load vars for managed_node1 18445 1726882535.22678: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882535.22684: Calling all_plugins_play to load vars for managed_node1 18445 1726882535.22686: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882535.22687: Calling groups_plugins_play to load vars for managed_node1 18445 1726882535.22789: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882535.22926: done with get_vars() 18445 1726882535.22933: done getting variables 18445 1726882535.22973: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:35:35 -0400 (0:00:00.043) 0:00:07.166 ****** 18445 1726882535.22992: entering _queue_task() for managed_node1/fail 18445 1726882535.23143: worker is 1 (out of 1 available) 18445 1726882535.23154: exiting _queue_task() for managed_node1/fail 18445 1726882535.23164: done queuing things up, now waiting for results queue to drain 18445 1726882535.23166: waiting for pending results... 
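The skip recorded above for task 0e448fcc-3ce9-f6eb-935c-000000000062 is the normal outcome of a when:-gated role task: the "false_condition" string in the JSON result is the task's when: expression, evaluated against the ansible_distribution facts read a few lines earlier. Below is a minimal sketch of the shape of such a task; only the task name, the yum action (redirected to ansible.builtin.dnf by ansible-core, as logged just before the task banner) and the when: expression are taken from this trace, while the module arguments are assumed for illustration and are not read from roles/network/tasks/main.yml:48.

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:   # ansible-core redirects this action to ansible.builtin.dnf on this run
    list: updates        # assumed argument; the real task body is not shown in the log
  when: (ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)

Because the whole expression evaluated to False ("Evaluated conditional ... False"), the action plugin is never invoked and the host only records the skip result shown above.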
18445 1726882535.23335: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 18445 1726882535.23397: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000063 18445 1726882535.23409: variable 'ansible_search_path' from source: unknown 18445 1726882535.23413: variable 'ansible_search_path' from source: unknown 18445 1726882535.23446: calling self._execute() 18445 1726882535.23522: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882535.23525: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882535.23534: variable 'omit' from source: magic vars 18445 1726882535.23876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882535.26450: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882535.26521: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882535.26572: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882535.26611: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882535.26649: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882535.26759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882535.26797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882535.26826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882535.26872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882535.26891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882535.27009: variable 'ansible_distribution' from source: facts 18445 1726882535.27021: variable 'ansible_distribution_major_version' from source: facts 18445 1726882535.27042: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882535.27052: when evaluation is False, skipping this task 18445 1726882535.27059: _execute() done 18445 1726882535.27116: dumping result to json 18445 1726882535.27126: done dumping result, returning 18445 1726882535.27136: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-f6eb-935c-000000000063] 18445 1726882535.27146: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000063 18445 1726882535.27253: done sending task result for task 
0e448fcc-3ce9-f6eb-935c-000000000063 18445 1726882535.27263: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882535.27311: no more pending results, returning what we have 18445 1726882535.27315: results queue empty 18445 1726882535.27316: checking for any_errors_fatal 18445 1726882535.27321: done checking for any_errors_fatal 18445 1726882535.27322: checking for max_fail_percentage 18445 1726882535.27324: done checking for max_fail_percentage 18445 1726882535.27325: checking to see if all hosts have failed and the running result is not ok 18445 1726882535.27325: done checking to see if all hosts have failed 18445 1726882535.27326: getting the remaining hosts for this loop 18445 1726882535.27328: done getting the remaining hosts for this loop 18445 1726882535.27331: getting the next task for host managed_node1 18445 1726882535.27337: done getting next task for host managed_node1 18445 1726882535.27340: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 18445 1726882535.27342: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882535.27354: getting variables 18445 1726882535.27355: in VariableManager get_vars() 18445 1726882535.27392: Calling all_inventory to load vars for managed_node1 18445 1726882535.27395: Calling groups_inventory to load vars for managed_node1 18445 1726882535.27398: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882535.27407: Calling all_plugins_play to load vars for managed_node1 18445 1726882535.27410: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882535.27413: Calling groups_plugins_play to load vars for managed_node1 18445 1726882535.27581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882535.27781: done with get_vars() 18445 1726882535.27792: done getting variables 18445 1726882535.27848: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:35:35 -0400 (0:00:00.048) 0:00:07.215 ****** 18445 1726882535.27883: entering _queue_task() for managed_node1/package 18445 1726882535.28294: worker is 1 (out of 1 available) 18445 1726882535.28308: exiting _queue_task() for managed_node1/package 18445 1726882535.28322: done queuing things up, now waiting for results queue to drain 18445 1726882535.28324: waiting for pending results... 
18445 1726882535.28587: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 18445 1726882535.28683: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000064 18445 1726882535.28703: variable 'ansible_search_path' from source: unknown 18445 1726882535.28712: variable 'ansible_search_path' from source: unknown 18445 1726882535.28749: calling self._execute() 18445 1726882535.28830: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882535.28842: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882535.28856: variable 'omit' from source: magic vars 18445 1726882535.29272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882535.31549: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882535.31595: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882535.31620: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882535.31660: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882535.31680: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882535.31735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882535.31758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882535.31775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882535.31801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882535.31811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882535.31899: variable 'ansible_distribution' from source: facts 18445 1726882535.31903: variable 'ansible_distribution_major_version' from source: facts 18445 1726882535.31916: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882535.31919: when evaluation is False, skipping this task 18445 1726882535.31923: _execute() done 18445 1726882535.31925: dumping result to json 18445 1726882535.31928: done dumping result, returning 18445 1726882535.31933: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-f6eb-935c-000000000064] 18445 1726882535.31943: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000064 18445 1726882535.32020: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000064 18445 1726882535.32023: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, 
"false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882535.32083: no more pending results, returning what we have 18445 1726882535.32086: results queue empty 18445 1726882535.32087: checking for any_errors_fatal 18445 1726882535.32091: done checking for any_errors_fatal 18445 1726882535.32092: checking for max_fail_percentage 18445 1726882535.32093: done checking for max_fail_percentage 18445 1726882535.32094: checking to see if all hosts have failed and the running result is not ok 18445 1726882535.32095: done checking to see if all hosts have failed 18445 1726882535.32096: getting the remaining hosts for this loop 18445 1726882535.32097: done getting the remaining hosts for this loop 18445 1726882535.32100: getting the next task for host managed_node1 18445 1726882535.32105: done getting next task for host managed_node1 18445 1726882535.32108: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 18445 1726882535.32110: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882535.32121: getting variables 18445 1726882535.32122: in VariableManager get_vars() 18445 1726882535.32149: Calling all_inventory to load vars for managed_node1 18445 1726882535.32150: Calling groups_inventory to load vars for managed_node1 18445 1726882535.32152: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882535.32160: Calling all_plugins_play to load vars for managed_node1 18445 1726882535.32162: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882535.32165: Calling groups_plugins_play to load vars for managed_node1 18445 1726882535.32297: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882535.32410: done with get_vars() 18445 1726882535.32417: done getting variables 18445 1726882535.32453: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:35:35 -0400 (0:00:00.045) 0:00:07.261 ****** 18445 1726882535.32477: entering _queue_task() for managed_node1/package 18445 1726882535.32628: worker is 1 (out of 1 available) 18445 1726882535.32640: exiting _queue_task() for managed_node1/package 18445 1726882535.32651: done queuing things up, now waiting for results queue to drain 18445 1726882535.32653: waiting for pending results... 
18445 1726882535.32801: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 18445 1726882535.32858: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000065 18445 1726882535.32878: variable 'ansible_search_path' from source: unknown 18445 1726882535.32881: variable 'ansible_search_path' from source: unknown 18445 1726882535.32905: calling self._execute() 18445 1726882535.32963: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882535.32971: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882535.32979: variable 'omit' from source: magic vars 18445 1726882535.33257: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882535.34731: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882535.34776: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882535.34801: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882535.34825: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882535.34847: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882535.34899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882535.34918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882535.34935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882535.34968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882535.34978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882535.35065: variable 'ansible_distribution' from source: facts 18445 1726882535.35072: variable 'ansible_distribution_major_version' from source: facts 18445 1726882535.35088: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882535.35091: when evaluation is False, skipping this task 18445 1726882535.35094: _execute() done 18445 1726882535.35096: dumping result to json 18445 1726882535.35098: done dumping result, returning 18445 1726882535.35104: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-f6eb-935c-000000000065] 18445 1726882535.35109: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000065 18445 1726882535.35184: done sending task result for task 
0e448fcc-3ce9-f6eb-935c-000000000065 18445 1726882535.35187: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882535.35226: no more pending results, returning what we have 18445 1726882535.35230: results queue empty 18445 1726882535.35230: checking for any_errors_fatal 18445 1726882535.35235: done checking for any_errors_fatal 18445 1726882535.35236: checking for max_fail_percentage 18445 1726882535.35238: done checking for max_fail_percentage 18445 1726882535.35239: checking to see if all hosts have failed and the running result is not ok 18445 1726882535.35240: done checking to see if all hosts have failed 18445 1726882535.35240: getting the remaining hosts for this loop 18445 1726882535.35242: done getting the remaining hosts for this loop 18445 1726882535.35245: getting the next task for host managed_node1 18445 1726882535.35249: done getting next task for host managed_node1 18445 1726882535.35253: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 18445 1726882535.35254: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882535.35267: getting variables 18445 1726882535.35269: in VariableManager get_vars() 18445 1726882535.35298: Calling all_inventory to load vars for managed_node1 18445 1726882535.35300: Calling groups_inventory to load vars for managed_node1 18445 1726882535.35302: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882535.35309: Calling all_plugins_play to load vars for managed_node1 18445 1726882535.35312: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882535.35314: Calling groups_plugins_play to load vars for managed_node1 18445 1726882535.35417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882535.35531: done with get_vars() 18445 1726882535.35537: done getting variables 18445 1726882535.35576: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:35:35 -0400 (0:00:00.031) 0:00:07.292 ****** 18445 1726882535.35595: entering _queue_task() for managed_node1/package 18445 1726882535.35751: worker is 1 (out of 1 available) 18445 1726882535.35762: exiting _queue_task() for managed_node1/package 18445 1726882535.35775: done queuing things up, now waiting for results queue to drain 18445 1726882535.35776: waiting for pending results... 
18445 1726882535.35924: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 18445 1726882535.35981: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000066 18445 1726882535.35991: variable 'ansible_search_path' from source: unknown 18445 1726882535.35996: variable 'ansible_search_path' from source: unknown 18445 1726882535.36020: calling self._execute() 18445 1726882535.36076: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882535.36080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882535.36088: variable 'omit' from source: magic vars 18445 1726882535.36361: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882535.37868: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882535.37911: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882535.37936: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882535.37972: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882535.37992: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882535.38044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882535.38067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882535.38085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882535.38111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882535.38124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882535.38216: variable 'ansible_distribution' from source: facts 18445 1726882535.38220: variable 'ansible_distribution_major_version' from source: facts 18445 1726882535.38232: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882535.38235: when evaluation is False, skipping this task 18445 1726882535.38237: _execute() done 18445 1726882535.38239: dumping result to json 18445 1726882535.38242: done dumping result, returning 18445 1726882535.38247: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-f6eb-935c-000000000066] 18445 1726882535.38252: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000066 18445 1726882535.38333: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000066 18445 
1726882535.38336: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882535.38381: no more pending results, returning what we have 18445 1726882535.38384: results queue empty 18445 1726882535.38385: checking for any_errors_fatal 18445 1726882535.38390: done checking for any_errors_fatal 18445 1726882535.38390: checking for max_fail_percentage 18445 1726882535.38392: done checking for max_fail_percentage 18445 1726882535.38393: checking to see if all hosts have failed and the running result is not ok 18445 1726882535.38394: done checking to see if all hosts have failed 18445 1726882535.38394: getting the remaining hosts for this loop 18445 1726882535.38396: done getting the remaining hosts for this loop 18445 1726882535.38399: getting the next task for host managed_node1 18445 1726882535.38403: done getting next task for host managed_node1 18445 1726882535.38406: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 18445 1726882535.38408: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882535.38418: getting variables 18445 1726882535.38419: in VariableManager get_vars() 18445 1726882535.38449: Calling all_inventory to load vars for managed_node1 18445 1726882535.38452: Calling groups_inventory to load vars for managed_node1 18445 1726882535.38453: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882535.38459: Calling all_plugins_play to load vars for managed_node1 18445 1726882535.38461: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882535.38463: Calling groups_plugins_play to load vars for managed_node1 18445 1726882535.38594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882535.38707: done with get_vars() 18445 1726882535.38712: done getting variables 18445 1726882535.38748: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:35:35 -0400 (0:00:00.031) 0:00:07.324 ****** 18445 1726882535.38768: entering _queue_task() for managed_node1/service 18445 1726882535.38917: worker is 1 (out of 1 available) 18445 1726882535.38929: exiting _queue_task() for managed_node1/service 18445 1726882535.38939: done queuing things up, now waiting for results queue to drain 18445 1726882535.38940: waiting for pending results... 
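The two tasks above that mention the network_state variable (main.yml:85 and main.yml:96) install NetworkManager, nmstate and python3-libnmstate only for runs that drive the role through nmstate; here they were skipped on the distribution check before that mattered. Purely for context, a hypothetical play that would make those tasks relevant could pass the variable to the role as sketched below; the variable name comes from the task titles in this log, while the play itself and the (empty) nmstate payload are assumptions, not part of this run.

- hosts: managed_node1
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_state:      # assumed minimal nmstate-style document, for illustration only
          interfaces: []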
18445 1726882535.39088: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 18445 1726882535.39146: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000067 18445 1726882535.39159: variable 'ansible_search_path' from source: unknown 18445 1726882535.39164: variable 'ansible_search_path' from source: unknown 18445 1726882535.39193: calling self._execute() 18445 1726882535.39249: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882535.39252: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882535.39261: variable 'omit' from source: magic vars 18445 1726882535.39541: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882535.41032: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882535.41078: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882535.41103: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882535.41128: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882535.41146: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882535.41202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882535.41223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882535.41240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882535.41269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882535.41281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882535.41374: variable 'ansible_distribution' from source: facts 18445 1726882535.41378: variable 'ansible_distribution_major_version' from source: facts 18445 1726882535.41397: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882535.41401: when evaluation is False, skipping this task 18445 1726882535.41403: _execute() done 18445 1726882535.41406: dumping result to json 18445 1726882535.41408: done dumping result, returning 18445 1726882535.41413: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-f6eb-935c-000000000067] 18445 1726882535.41419: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000067 18445 1726882535.41497: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000067 18445 
1726882535.41500: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882535.41541: no more pending results, returning what we have 18445 1726882535.41543: results queue empty 18445 1726882535.41544: checking for any_errors_fatal 18445 1726882535.41550: done checking for any_errors_fatal 18445 1726882535.41550: checking for max_fail_percentage 18445 1726882535.41552: done checking for max_fail_percentage 18445 1726882535.41553: checking to see if all hosts have failed and the running result is not ok 18445 1726882535.41556: done checking to see if all hosts have failed 18445 1726882535.41557: getting the remaining hosts for this loop 18445 1726882535.41558: done getting the remaining hosts for this loop 18445 1726882535.41562: getting the next task for host managed_node1 18445 1726882535.41568: done getting next task for host managed_node1 18445 1726882535.41572: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 18445 1726882535.41574: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882535.41584: getting variables 18445 1726882535.41586: in VariableManager get_vars() 18445 1726882535.41617: Calling all_inventory to load vars for managed_node1 18445 1726882535.41619: Calling groups_inventory to load vars for managed_node1 18445 1726882535.41621: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882535.41626: Calling all_plugins_play to load vars for managed_node1 18445 1726882535.41628: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882535.41629: Calling groups_plugins_play to load vars for managed_node1 18445 1726882535.41736: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882535.41850: done with get_vars() 18445 1726882535.41858: done getting variables 18445 1726882535.41897: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:35:35 -0400 (0:00:00.031) 0:00:07.356 ****** 18445 1726882535.41915: entering _queue_task() for managed_node1/service 18445 1726882535.42068: worker is 1 (out of 1 available) 18445 1726882535.42079: exiting _queue_task() for managed_node1/service 18445 1726882535.42089: done queuing things up, now waiting for results queue to drain 18445 1726882535.42091: waiting for pending results... 
18445 1726882535.42229: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 18445 1726882535.42284: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000068 18445 1726882535.42294: variable 'ansible_search_path' from source: unknown 18445 1726882535.42297: variable 'ansible_search_path' from source: unknown 18445 1726882535.42323: calling self._execute() 18445 1726882535.42374: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882535.42378: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882535.42385: variable 'omit' from source: magic vars 18445 1726882535.42648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882535.44120: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882535.44168: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882535.44194: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882535.44378: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882535.44396: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882535.44447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882535.44475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882535.44495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882535.44520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882535.44531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882535.44620: variable 'ansible_distribution' from source: facts 18445 1726882535.44623: variable 'ansible_distribution_major_version' from source: facts 18445 1726882535.44637: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882535.44640: when evaluation is False, skipping this task 18445 1726882535.44642: _execute() done 18445 1726882535.44645: dumping result to json 18445 1726882535.44647: done dumping result, returning 18445 1726882535.44652: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-f6eb-935c-000000000068] 18445 1726882535.44666: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000068 18445 1726882535.44743: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000068 18445 1726882535.44745: WORKER PROCESS EXITING skipping: 
[managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18445 1726882535.44789: no more pending results, returning what we have 18445 1726882535.44792: results queue empty 18445 1726882535.44793: checking for any_errors_fatal 18445 1726882535.44797: done checking for any_errors_fatal 18445 1726882535.44797: checking for max_fail_percentage 18445 1726882535.44799: done checking for max_fail_percentage 18445 1726882535.44800: checking to see if all hosts have failed and the running result is not ok 18445 1726882535.44801: done checking to see if all hosts have failed 18445 1726882535.44802: getting the remaining hosts for this loop 18445 1726882535.44803: done getting the remaining hosts for this loop 18445 1726882535.44806: getting the next task for host managed_node1 18445 1726882535.44810: done getting next task for host managed_node1 18445 1726882535.44813: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 18445 1726882535.44815: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882535.44826: getting variables 18445 1726882535.44827: in VariableManager get_vars() 18445 1726882535.44857: Calling all_inventory to load vars for managed_node1 18445 1726882535.44859: Calling groups_inventory to load vars for managed_node1 18445 1726882535.44860: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882535.44869: Calling all_plugins_play to load vars for managed_node1 18445 1726882535.44871: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882535.44873: Calling groups_plugins_play to load vars for managed_node1 18445 1726882535.44973: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882535.45118: done with get_vars() 18445 1726882535.45124: done getting variables 18445 1726882535.45161: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:35:35 -0400 (0:00:00.032) 0:00:07.388 ****** 18445 1726882535.45181: entering _queue_task() for managed_node1/service 18445 1726882535.45326: worker is 1 (out of 1 available) 18445 1726882535.45337: exiting _queue_task() for managed_node1/service 18445 1726882535.45347: done queuing things up, now waiting for results queue to drain 18445 1726882535.45348: waiting for pending results... 
18445 1726882535.45492: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 18445 1726882535.45546: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000069 18445 1726882535.45558: variable 'ansible_search_path' from source: unknown 18445 1726882535.45562: variable 'ansible_search_path' from source: unknown 18445 1726882535.45585: calling self._execute() 18445 1726882535.45635: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882535.45638: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882535.45646: variable 'omit' from source: magic vars 18445 1726882535.45931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882535.47833: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882535.47876: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882535.47900: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882535.47924: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882535.47944: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882535.48003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882535.48023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882535.48039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882535.48071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882535.48082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882535.48176: variable 'ansible_distribution' from source: facts 18445 1726882535.48202: variable 'ansible_distribution_major_version' from source: facts 18445 1726882535.48221: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882535.48228: when evaluation is False, skipping this task 18445 1726882535.48234: _execute() done 18445 1726882535.48238: dumping result to json 18445 1726882535.48244: done dumping result, returning 18445 1726882535.48252: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-f6eb-935c-000000000069] 18445 1726882535.48260: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000069 18445 1726882535.48354: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000069 18445 1726882535.48361: WORKER PROCESS EXITING skipping: 
[managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882535.48578: no more pending results, returning what we have 18445 1726882535.48581: results queue empty 18445 1726882535.48582: checking for any_errors_fatal 18445 1726882535.48588: done checking for any_errors_fatal 18445 1726882535.48589: checking for max_fail_percentage 18445 1726882535.48590: done checking for max_fail_percentage 18445 1726882535.48591: checking to see if all hosts have failed and the running result is not ok 18445 1726882535.48592: done checking to see if all hosts have failed 18445 1726882535.48593: getting the remaining hosts for this loop 18445 1726882535.48594: done getting the remaining hosts for this loop 18445 1726882535.48597: getting the next task for host managed_node1 18445 1726882535.48601: done getting next task for host managed_node1 18445 1726882535.48605: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 18445 1726882535.48607: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882535.48617: getting variables 18445 1726882535.48619: in VariableManager get_vars() 18445 1726882535.48649: Calling all_inventory to load vars for managed_node1 18445 1726882535.48651: Calling groups_inventory to load vars for managed_node1 18445 1726882535.48653: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882535.48661: Calling all_plugins_play to load vars for managed_node1 18445 1726882535.48666: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882535.48669: Calling groups_plugins_play to load vars for managed_node1 18445 1726882535.48827: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882535.49028: done with get_vars() 18445 1726882535.49037: done getting variables 18445 1726882535.49089: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:35:35 -0400 (0:00:00.039) 0:00:07.428 ****** 18445 1726882535.49113: entering _queue_task() for managed_node1/service 18445 1726882535.49302: worker is 1 (out of 1 available) 18445 1726882535.49313: exiting _queue_task() for managed_node1/service 18445 1726882535.49323: done queuing things up, now waiting for results queue to drain 18445 1726882535.49325: waiting for pending results... 
18445 1726882535.49560: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 18445 1726882535.49649: in run() - task 0e448fcc-3ce9-f6eb-935c-00000000006a 18445 1726882535.49670: variable 'ansible_search_path' from source: unknown 18445 1726882535.49681: variable 'ansible_search_path' from source: unknown 18445 1726882535.49718: calling self._execute() 18445 1726882535.49792: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882535.49803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882535.49816: variable 'omit' from source: magic vars 18445 1726882535.50208: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882535.52560: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882535.52603: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882535.52627: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882535.52654: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882535.52677: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882535.52745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882535.52770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882535.52786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882535.52811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882535.52822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882535.52913: variable 'ansible_distribution' from source: facts 18445 1726882535.52917: variable 'ansible_distribution_major_version' from source: facts 18445 1726882535.52933: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882535.52936: when evaluation is False, skipping this task 18445 1726882535.52938: _execute() done 18445 1726882535.52941: dumping result to json 18445 1726882535.52943: done dumping result, returning 18445 1726882535.52950: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-f6eb-935c-00000000006a] 18445 1726882535.52958: sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000006a 18445 1726882535.53032: done sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000006a 18445 1726882535.53035: WORKER PROCESS EXITING skipping: [managed_node1] => { 
"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18445 1726882535.53094: no more pending results, returning what we have 18445 1726882535.53097: results queue empty 18445 1726882535.53098: checking for any_errors_fatal 18445 1726882535.53103: done checking for any_errors_fatal 18445 1726882535.53104: checking for max_fail_percentage 18445 1726882535.53105: done checking for max_fail_percentage 18445 1726882535.53106: checking to see if all hosts have failed and the running result is not ok 18445 1726882535.53107: done checking to see if all hosts have failed 18445 1726882535.53108: getting the remaining hosts for this loop 18445 1726882535.53109: done getting the remaining hosts for this loop 18445 1726882535.53112: getting the next task for host managed_node1 18445 1726882535.53117: done getting next task for host managed_node1 18445 1726882535.53121: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 18445 1726882535.53123: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882535.53134: getting variables 18445 1726882535.53135: in VariableManager get_vars() 18445 1726882535.53168: Calling all_inventory to load vars for managed_node1 18445 1726882535.53170: Calling groups_inventory to load vars for managed_node1 18445 1726882535.53173: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882535.53180: Calling all_plugins_play to load vars for managed_node1 18445 1726882535.53183: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882535.53185: Calling groups_plugins_play to load vars for managed_node1 18445 1726882535.53325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882535.53438: done with get_vars() 18445 1726882535.53445: done getting variables 18445 1726882535.53485: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:35:35 -0400 (0:00:00.043) 0:00:07.472 ****** 18445 1726882535.53505: entering _queue_task() for managed_node1/copy 18445 1726882535.53669: worker is 1 (out of 1 available) 18445 1726882535.53682: exiting _queue_task() for managed_node1/copy 18445 1726882535.53694: done queuing things up, now waiting for results queue to drain 18445 1726882535.53695: waiting for pending results... 
18445 1726882535.53844: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 18445 1726882535.53899: in run() - task 0e448fcc-3ce9-f6eb-935c-00000000006b 18445 1726882535.53909: variable 'ansible_search_path' from source: unknown 18445 1726882535.53912: variable 'ansible_search_path' from source: unknown 18445 1726882535.53938: calling self._execute() 18445 1726882535.54017: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882535.54029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882535.54044: variable 'omit' from source: magic vars 18445 1726882535.54486: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882535.57297: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882535.57378: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882535.57415: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882535.57467: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882535.57497: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882535.57590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882535.57624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882535.57668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882535.57713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882535.57730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882535.57882: variable 'ansible_distribution' from source: facts 18445 1726882535.57893: variable 'ansible_distribution_major_version' from source: facts 18445 1726882535.57915: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882535.57924: when evaluation is False, skipping this task 18445 1726882535.57930: _execute() done 18445 1726882535.57935: dumping result to json 18445 1726882535.57941: done dumping result, returning 18445 1726882535.57951: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-f6eb-935c-00000000006b] 18445 1726882535.57969: sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000006b skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882535.58124: no more pending results, returning what we have 18445 1726882535.58128: results queue empty 18445 1726882535.58128: checking for any_errors_fatal 18445 1726882535.58135: done checking for any_errors_fatal 18445 1726882535.58135: checking for max_fail_percentage 18445 1726882535.58137: done checking for max_fail_percentage 18445 1726882535.58138: checking to see if all hosts have failed and the running result is not ok 18445 1726882535.58139: done checking to see if all hosts have failed 18445 1726882535.58140: getting the remaining hosts for this loop 18445 1726882535.58144: done getting the remaining hosts for this loop 18445 1726882535.58147: getting the next task for host managed_node1 18445 1726882535.58153: done getting next task for host managed_node1 18445 1726882535.58158: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 18445 1726882535.58161: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882535.58178: getting variables 18445 1726882535.58180: in VariableManager get_vars() 18445 1726882535.58219: Calling all_inventory to load vars for managed_node1 18445 1726882535.58221: Calling groups_inventory to load vars for managed_node1 18445 1726882535.58224: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882535.58234: Calling all_plugins_play to load vars for managed_node1 18445 1726882535.58237: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882535.58240: Calling groups_plugins_play to load vars for managed_node1 18445 1726882535.58414: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882535.58633: done with get_vars() 18445 1726882535.58643: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:35:35 -0400 (0:00:00.052) 0:00:07.524 ****** 18445 1726882535.58736: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 18445 1726882535.58757: done sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000006b 18445 1726882535.58766: WORKER PROCESS EXITING 18445 1726882535.59277: worker is 1 (out of 1 available) 18445 1726882535.59289: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 18445 1726882535.59300: done queuing things up, now waiting for results queue to drain 18445 1726882535.59302: waiting for pending results... 
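
Every skip in this block traces back to the same guard, quoted verbatim in the "Evaluated conditional" and "false_condition" entries. A hedged sketch of a copy task carrying that guard is shown below; only the when expression is taken from the log, the task name echoes the role task, and the file payload is a made-up placeholder:

    - name: Ensure initscripts network file dependency is present   # name echoes the role task; body is illustrative
      ansible.builtin.copy:
        dest: /tmp/example-initscripts-dependency   # placeholder path, not the role's real destination
        content: "placeholder\n"
        mode: "0644"
      when: >-
        ansible_distribution in ['CentOS','RedHat']
        and ansible_distribution_major_version | int < 9
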
18445 1726882535.60313: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 18445 1726882535.60438: in run() - task 0e448fcc-3ce9-f6eb-935c-00000000006c 18445 1726882535.60476: variable 'ansible_search_path' from source: unknown 18445 1726882535.60484: variable 'ansible_search_path' from source: unknown 18445 1726882535.60520: calling self._execute() 18445 1726882535.60616: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882535.60626: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882535.60638: variable 'omit' from source: magic vars 18445 1726882535.61081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882535.64678: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882535.64749: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882535.64795: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882535.64835: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882535.64879: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882535.64959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882535.64995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882535.65023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882535.65071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882535.65089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882535.65220: variable 'ansible_distribution' from source: facts 18445 1726882535.65234: variable 'ansible_distribution_major_version' from source: facts 18445 1726882535.65258: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882535.65268: when evaluation is False, skipping this task 18445 1726882535.65274: _execute() done 18445 1726882535.65280: dumping result to json 18445 1726882535.65286: done dumping result, returning 18445 1726882535.65296: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-f6eb-935c-00000000006c] 18445 1726882535.65304: sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000006c skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882535.65456: no more pending results, returning what we have 18445 1726882535.65460: results queue empty 18445 1726882535.65460: checking for any_errors_fatal 18445 1726882535.65468: done checking for any_errors_fatal 18445 1726882535.65469: checking for max_fail_percentage 18445 1726882535.65470: done checking for max_fail_percentage 18445 1726882535.65471: checking to see if all hosts have failed and the running result is not ok 18445 1726882535.65472: done checking to see if all hosts have failed 18445 1726882535.65473: getting the remaining hosts for this loop 18445 1726882535.65474: done getting the remaining hosts for this loop 18445 1726882535.65477: getting the next task for host managed_node1 18445 1726882535.65483: done getting next task for host managed_node1 18445 1726882535.65486: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 18445 1726882535.65488: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882535.65499: getting variables 18445 1726882535.65500: in VariableManager get_vars() 18445 1726882535.65535: Calling all_inventory to load vars for managed_node1 18445 1726882535.65537: Calling groups_inventory to load vars for managed_node1 18445 1726882535.65539: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882535.65548: Calling all_plugins_play to load vars for managed_node1 18445 1726882535.65551: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882535.65556: Calling groups_plugins_play to load vars for managed_node1 18445 1726882535.65771: done sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000006c 18445 1726882535.65774: WORKER PROCESS EXITING 18445 1726882535.65781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882535.65987: done with get_vars() 18445 1726882535.65997: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:35:35 -0400 (0:00:00.073) 0:00:07.597 ****** 18445 1726882535.66073: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 18445 1726882535.66704: worker is 1 (out of 1 available) 18445 1726882535.66714: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 18445 1726882535.66725: done queuing things up, now waiting for results queue to drain 18445 1726882535.66726: waiting for pending results... 
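
The "Configure networking connection profiles" task dispatches to the role's network_connections action plugin. A caller-side sketch of how a playbook typically feeds that interface is shown below; the host pattern, interface name, and IP settings are examples, not values read from this run:

    - hosts: managed_node1
      roles:
        - role: fedora.linux_system_roles.network
          vars:
            network_connections:
              - name: eth0            # example interface, not from this run
                type: ethernet
                state: up
                ip:
                  dhcp4: true
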
18445 1726882535.67557: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 18445 1726882535.67652: in run() - task 0e448fcc-3ce9-f6eb-935c-00000000006d 18445 1726882535.67747: variable 'ansible_search_path' from source: unknown 18445 1726882535.67756: variable 'ansible_search_path' from source: unknown 18445 1726882535.67794: calling self._execute() 18445 1726882535.67986: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882535.67997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882535.68011: variable 'omit' from source: magic vars 18445 1726882535.68886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882535.72517: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882535.72601: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882535.72645: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882535.72689: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882535.72721: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882535.72810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882535.72842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882535.72882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882535.72928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882535.72949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882535.73096: variable 'ansible_distribution' from source: facts 18445 1726882535.73107: variable 'ansible_distribution_major_version' from source: facts 18445 1726882535.73129: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882535.73137: when evaluation is False, skipping this task 18445 1726882535.73144: _execute() done 18445 1726882535.73149: dumping result to json 18445 1726882535.73161: done dumping result, returning 18445 1726882535.73174: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-f6eb-935c-00000000006d] 18445 1726882535.73192: sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000006d skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", 
"skip_reason": "Conditional result was False" } 18445 1726882535.73333: no more pending results, returning what we have 18445 1726882535.73337: results queue empty 18445 1726882535.73338: checking for any_errors_fatal 18445 1726882535.73344: done checking for any_errors_fatal 18445 1726882535.73345: checking for max_fail_percentage 18445 1726882535.73346: done checking for max_fail_percentage 18445 1726882535.73347: checking to see if all hosts have failed and the running result is not ok 18445 1726882535.73348: done checking to see if all hosts have failed 18445 1726882535.73349: getting the remaining hosts for this loop 18445 1726882535.73351: done getting the remaining hosts for this loop 18445 1726882535.73356: getting the next task for host managed_node1 18445 1726882535.73362: done getting next task for host managed_node1 18445 1726882535.73366: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 18445 1726882535.73368: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882535.73380: getting variables 18445 1726882535.73381: in VariableManager get_vars() 18445 1726882535.73415: Calling all_inventory to load vars for managed_node1 18445 1726882535.73418: Calling groups_inventory to load vars for managed_node1 18445 1726882535.73421: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882535.73430: Calling all_plugins_play to load vars for managed_node1 18445 1726882535.73433: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882535.73436: Calling groups_plugins_play to load vars for managed_node1 18445 1726882535.73619: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882535.73828: done with get_vars() 18445 1726882535.73839: done getting variables 18445 1726882535.73910: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:35:35 -0400 (0:00:00.078) 0:00:07.676 ****** 18445 1726882535.73943: entering _queue_task() for managed_node1/debug 18445 1726882535.73966: done sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000006d 18445 1726882535.73973: WORKER PROCESS EXITING 18445 1726882535.74497: worker is 1 (out of 1 available) 18445 1726882535.74508: exiting _queue_task() for managed_node1/debug 18445 1726882535.74519: done queuing things up, now waiting for results queue to drain 18445 1726882535.74521: waiting for pending results... 
18445 1726882535.75078: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 18445 1726882535.75173: in run() - task 0e448fcc-3ce9-f6eb-935c-00000000006e 18445 1726882535.75191: variable 'ansible_search_path' from source: unknown 18445 1726882535.75198: variable 'ansible_search_path' from source: unknown 18445 1726882535.75238: calling self._execute() 18445 1726882535.75319: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882535.75335: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882535.75349: variable 'omit' from source: magic vars 18445 1726882535.75798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882535.80117: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882535.80189: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882535.80229: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882535.80284: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882535.80316: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882535.80402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882535.80437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882535.80471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882535.80517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882535.80537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882535.80678: variable 'ansible_distribution' from source: facts 18445 1726882535.80778: variable 'ansible_distribution_major_version' from source: facts 18445 1726882535.81088: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882535.81097: when evaluation is False, skipping this task 18445 1726882535.81104: _execute() done 18445 1726882535.81110: dumping result to json 18445 1726882535.81117: done dumping result, returning 18445 1726882535.81128: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-f6eb-935c-00000000006e] 18445 1726882535.81138: sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000006e skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)" } 18445 1726882535.81277: no more pending results, returning what we have 18445 1726882535.81281: results queue empty 18445 1726882535.81282: checking for any_errors_fatal 18445 1726882535.81287: done checking for any_errors_fatal 18445 1726882535.81288: checking for max_fail_percentage 18445 1726882535.81289: done checking for max_fail_percentage 18445 1726882535.81290: checking to see if all hosts have failed and the running result is not ok 18445 1726882535.81291: done checking to see if all hosts have failed 18445 1726882535.81292: getting the remaining hosts for this loop 18445 1726882535.81293: done getting the remaining hosts for this loop 18445 1726882535.81296: getting the next task for host managed_node1 18445 1726882535.81302: done getting next task for host managed_node1 18445 1726882535.81305: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 18445 1726882535.81307: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882535.81322: getting variables 18445 1726882535.81324: in VariableManager get_vars() 18445 1726882535.81361: Calling all_inventory to load vars for managed_node1 18445 1726882535.81365: Calling groups_inventory to load vars for managed_node1 18445 1726882535.81368: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882535.81377: Calling all_plugins_play to load vars for managed_node1 18445 1726882535.81380: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882535.81383: Calling groups_plugins_play to load vars for managed_node1 18445 1726882535.81539: done sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000006e 18445 1726882535.81542: WORKER PROCESS EXITING 18445 1726882535.81568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882535.82165: done with get_vars() 18445 1726882535.82175: done getting variables 18445 1726882535.83251: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:35:35 -0400 (0:00:00.093) 0:00:07.769 ****** 18445 1726882535.83284: entering _queue_task() for managed_node1/debug 18445 1726882535.84213: worker is 1 (out of 1 available) 18445 1726882535.84225: exiting _queue_task() for managed_node1/debug 18445 1726882535.84237: done queuing things up, now waiting for results queue to drain 18445 1726882535.84239: waiting for pending results... 
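
The "Show stderr messages for the network_connections" task is queued for the debug action, as the ActionModule 'debug' load above indicates. A hedged sketch of a debug task of that shape follows; the registered variable name __network_connections_result is an assumption about the role's internals, not confirmed by this log:

    - name: Show stderr messages for the network_connections   # illustrative
      ansible.builtin.debug:
        var: __network_connections_result.stderr_lines   # variable name is an assumption
      when: >-
        ansible_distribution in ['CentOS','RedHat']
        and ansible_distribution_major_version | int < 9
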
18445 1726882535.85132: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 18445 1726882535.85208: in run() - task 0e448fcc-3ce9-f6eb-935c-00000000006f 18445 1726882535.85337: variable 'ansible_search_path' from source: unknown 18445 1726882535.85341: variable 'ansible_search_path' from source: unknown 18445 1726882535.85374: calling self._execute() 18445 1726882535.85560: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882535.85567: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882535.85576: variable 'omit' from source: magic vars 18445 1726882535.86507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882535.92141: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882535.92437: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882535.92483: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882535.92516: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882535.92544: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882535.92758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882535.92798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882535.92830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882535.93014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882535.93035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882535.93879: variable 'ansible_distribution' from source: facts 18445 1726882535.93891: variable 'ansible_distribution_major_version' from source: facts 18445 1726882535.93914: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882535.93922: when evaluation is False, skipping this task 18445 1726882535.93928: _execute() done 18445 1726882535.93934: dumping result to json 18445 1726882535.93942: done dumping result, returning 18445 1726882535.93953: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-f6eb-935c-00000000006f] 18445 1726882535.93963: sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000006f 18445 1726882535.94072: done sending task result for task 0e448fcc-3ce9-f6eb-935c-00000000006f skipping: [managed_node1] => { 
"false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 18445 1726882535.94122: no more pending results, returning what we have 18445 1726882535.94125: results queue empty 18445 1726882535.94126: checking for any_errors_fatal 18445 1726882535.94132: done checking for any_errors_fatal 18445 1726882535.94133: checking for max_fail_percentage 18445 1726882535.94134: done checking for max_fail_percentage 18445 1726882535.94135: checking to see if all hosts have failed and the running result is not ok 18445 1726882535.94136: done checking to see if all hosts have failed 18445 1726882535.94137: getting the remaining hosts for this loop 18445 1726882535.94138: done getting the remaining hosts for this loop 18445 1726882535.94141: getting the next task for host managed_node1 18445 1726882535.94147: done getting next task for host managed_node1 18445 1726882535.94151: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 18445 1726882535.94153: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882535.94168: WORKER PROCESS EXITING 18445 1726882535.94176: getting variables 18445 1726882535.94178: in VariableManager get_vars() 18445 1726882535.94216: Calling all_inventory to load vars for managed_node1 18445 1726882535.94218: Calling groups_inventory to load vars for managed_node1 18445 1726882535.94220: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882535.94229: Calling all_plugins_play to load vars for managed_node1 18445 1726882535.94232: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882535.94234: Calling groups_plugins_play to load vars for managed_node1 18445 1726882535.94411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882535.94639: done with get_vars() 18445 1726882535.94652: done getting variables 18445 1726882535.94721: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:35:35 -0400 (0:00:00.114) 0:00:07.884 ****** 18445 1726882535.94752: entering _queue_task() for managed_node1/debug 18445 1726882535.95368: worker is 1 (out of 1 available) 18445 1726882535.95381: exiting _queue_task() for managed_node1/debug 18445 1726882535.95392: done queuing things up, now waiting for results queue to drain 18445 1726882535.95394: waiting for pending results... 
18445 1726882535.96125: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 18445 1726882535.96220: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000070 18445 1726882535.96587: variable 'ansible_search_path' from source: unknown 18445 1726882535.96594: variable 'ansible_search_path' from source: unknown 18445 1726882535.96629: calling self._execute() 18445 1726882535.96706: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882535.96717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882535.96729: variable 'omit' from source: magic vars 18445 1726882535.97325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882536.01538: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882536.01609: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882536.01653: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882536.01710: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882536.01745: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882536.01825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882536.01890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882536.01924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882536.01977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882536.01998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882536.02161: variable 'ansible_distribution' from source: facts 18445 1726882536.02176: variable 'ansible_distribution_major_version' from source: facts 18445 1726882536.02197: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882536.02203: when evaluation is False, skipping this task 18445 1726882536.02209: _execute() done 18445 1726882536.02214: dumping result to json 18445 1726882536.02221: done dumping result, returning 18445 1726882536.02230: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-f6eb-935c-000000000070] 18445 1726882536.02240: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000070 18445 1726882536.02349: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000070 18445 1726882536.02357: WORKER PROCESS EXITING 
skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 18445 1726882536.02416: no more pending results, returning what we have 18445 1726882536.02420: results queue empty 18445 1726882536.02420: checking for any_errors_fatal 18445 1726882536.02425: done checking for any_errors_fatal 18445 1726882536.02425: checking for max_fail_percentage 18445 1726882536.02427: done checking for max_fail_percentage 18445 1726882536.02428: checking to see if all hosts have failed and the running result is not ok 18445 1726882536.02429: done checking to see if all hosts have failed 18445 1726882536.02430: getting the remaining hosts for this loop 18445 1726882536.02431: done getting the remaining hosts for this loop 18445 1726882536.02434: getting the next task for host managed_node1 18445 1726882536.02439: done getting next task for host managed_node1 18445 1726882536.02443: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 18445 1726882536.02445: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882536.02458: getting variables 18445 1726882536.02460: in VariableManager get_vars() 18445 1726882536.02496: Calling all_inventory to load vars for managed_node1 18445 1726882536.02498: Calling groups_inventory to load vars for managed_node1 18445 1726882536.02501: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882536.02510: Calling all_plugins_play to load vars for managed_node1 18445 1726882536.02513: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882536.02515: Calling groups_plugins_play to load vars for managed_node1 18445 1726882536.02743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882536.02941: done with get_vars() 18445 1726882536.02951: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:35:36 -0400 (0:00:00.082) 0:00:07.967 ****** 18445 1726882536.03041: entering _queue_task() for managed_node1/ping 18445 1726882536.03287: worker is 1 (out of 1 available) 18445 1726882536.03305: exiting _queue_task() for managed_node1/ping 18445 1726882536.03315: done queuing things up, now waiting for results queue to drain 18445 1726882536.03317: waiting for pending results... 
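
The "Re-test connectivity" task is queued for the ping action, which simply round-trips to the managed host and returns "pong". A minimal stand-alone equivalent:

    - name: Re-test connectivity
      ansible.builtin.ping:
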
18445 1726882536.04100: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 18445 1726882536.04312: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000071 18445 1726882536.04451: variable 'ansible_search_path' from source: unknown 18445 1726882536.04465: variable 'ansible_search_path' from source: unknown 18445 1726882536.04508: calling self._execute() 18445 1726882536.04717: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882536.04774: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882536.04790: variable 'omit' from source: magic vars 18445 1726882536.05687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882536.09091: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882536.09173: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882536.09215: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882536.09267: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882536.09310: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882536.09396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882536.09429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882536.09469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882536.09515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882536.09558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882536.09700: variable 'ansible_distribution' from source: facts 18445 1726882536.09712: variable 'ansible_distribution_major_version' from source: facts 18445 1726882536.09733: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882536.09741: when evaluation is False, skipping this task 18445 1726882536.09748: _execute() done 18445 1726882536.09757: dumping result to json 18445 1726882536.09769: done dumping result, returning 18445 1726882536.09782: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-f6eb-935c-000000000071] 18445 1726882536.09792: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000071 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": 
"Conditional result was False" } 18445 1726882536.09926: no more pending results, returning what we have 18445 1726882536.09930: results queue empty 18445 1726882536.09931: checking for any_errors_fatal 18445 1726882536.09937: done checking for any_errors_fatal 18445 1726882536.09937: checking for max_fail_percentage 18445 1726882536.09939: done checking for max_fail_percentage 18445 1726882536.09940: checking to see if all hosts have failed and the running result is not ok 18445 1726882536.09941: done checking to see if all hosts have failed 18445 1726882536.09942: getting the remaining hosts for this loop 18445 1726882536.09944: done getting the remaining hosts for this loop 18445 1726882536.09947: getting the next task for host managed_node1 18445 1726882536.09957: done getting next task for host managed_node1 18445 1726882536.09960: ^ task is: TASK: meta (role_complete) 18445 1726882536.09962: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882536.09977: getting variables 18445 1726882536.09979: in VariableManager get_vars() 18445 1726882536.10020: Calling all_inventory to load vars for managed_node1 18445 1726882536.10023: Calling groups_inventory to load vars for managed_node1 18445 1726882536.10026: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882536.10037: Calling all_plugins_play to load vars for managed_node1 18445 1726882536.10040: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882536.10043: Calling groups_plugins_play to load vars for managed_node1 18445 1726882536.10231: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882536.10471: done with get_vars() 18445 1726882536.10490: done getting variables 18445 1726882536.10579: done queuing things up, now waiting for results queue to drain 18445 1726882536.10581: results queue empty 18445 1726882536.10582: checking for any_errors_fatal 18445 1726882536.10583: done checking for any_errors_fatal 18445 1726882536.10584: checking for max_fail_percentage 18445 1726882536.10585: done checking for max_fail_percentage 18445 1726882536.10585: checking to see if all hosts have failed and the running result is not ok 18445 1726882536.10586: done checking to see if all hosts have failed 18445 1726882536.10587: getting the remaining hosts for this loop 18445 1726882536.10588: done getting the remaining hosts for this loop 18445 1726882536.10590: getting the next task for host managed_node1 18445 1726882536.10593: done getting next task for host managed_node1 18445 1726882536.10594: ^ task is: TASK: meta (flush_handlers) 18445 1726882536.10595: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18445 1726882536.10598: getting variables 18445 1726882536.10599: in VariableManager get_vars() 18445 1726882536.10610: Calling all_inventory to load vars for managed_node1 18445 1726882536.10612: Calling groups_inventory to load vars for managed_node1 18445 1726882536.10614: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882536.10618: Calling all_plugins_play to load vars for managed_node1 18445 1726882536.10620: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882536.10623: Calling groups_plugins_play to load vars for managed_node1 18445 1726882536.11062: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000071 18445 1726882536.11068: WORKER PROCESS EXITING 18445 1726882536.11086: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882536.11315: done with get_vars() 18445 1726882536.11330: done getting variables 18445 1726882536.11378: in VariableManager get_vars() 18445 1726882536.11388: Calling all_inventory to load vars for managed_node1 18445 1726882536.11395: Calling groups_inventory to load vars for managed_node1 18445 1726882536.11397: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882536.11401: Calling all_plugins_play to load vars for managed_node1 18445 1726882536.11404: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882536.11407: Calling groups_plugins_play to load vars for managed_node1 18445 1726882536.11561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882536.11759: done with get_vars() 18445 1726882536.11774: done queuing things up, now waiting for results queue to drain 18445 1726882536.11775: results queue empty 18445 1726882536.11776: checking for any_errors_fatal 18445 1726882536.11777: done checking for any_errors_fatal 18445 1726882536.11778: checking for max_fail_percentage 18445 1726882536.11779: done checking for max_fail_percentage 18445 1726882536.11780: checking to see if all hosts have failed and the running result is not ok 18445 1726882536.11780: done checking to see if all hosts have failed 18445 1726882536.11781: getting the remaining hosts for this loop 18445 1726882536.11782: done getting the remaining hosts for this loop 18445 1726882536.11784: getting the next task for host managed_node1 18445 1726882536.11787: done getting next task for host managed_node1 18445 1726882536.11789: ^ task is: TASK: meta (flush_handlers) 18445 1726882536.11790: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18445 1726882536.11792: getting variables 18445 1726882536.11793: in VariableManager get_vars() 18445 1726882536.11802: Calling all_inventory to load vars for managed_node1 18445 1726882536.11804: Calling groups_inventory to load vars for managed_node1 18445 1726882536.11806: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882536.11810: Calling all_plugins_play to load vars for managed_node1 18445 1726882536.11813: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882536.11815: Calling groups_plugins_play to load vars for managed_node1 18445 1726882536.11948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882536.12188: done with get_vars() 18445 1726882536.12195: done getting variables 18445 1726882536.12235: in VariableManager get_vars() 18445 1726882536.12245: Calling all_inventory to load vars for managed_node1 18445 1726882536.12247: Calling groups_inventory to load vars for managed_node1 18445 1726882536.12249: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882536.12253: Calling all_plugins_play to load vars for managed_node1 18445 1726882536.12258: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882536.12260: Calling groups_plugins_play to load vars for managed_node1 18445 1726882536.12418: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882536.12626: done with get_vars() 18445 1726882536.12637: done queuing things up, now waiting for results queue to drain 18445 1726882536.12638: results queue empty 18445 1726882536.12639: checking for any_errors_fatal 18445 1726882536.12640: done checking for any_errors_fatal 18445 1726882536.12641: checking for max_fail_percentage 18445 1726882536.12642: done checking for max_fail_percentage 18445 1726882536.12643: checking to see if all hosts have failed and the running result is not ok 18445 1726882536.12643: done checking to see if all hosts have failed 18445 1726882536.12644: getting the remaining hosts for this loop 18445 1726882536.12645: done getting the remaining hosts for this loop 18445 1726882536.12647: getting the next task for host managed_node1 18445 1726882536.12649: done getting next task for host managed_node1 18445 1726882536.12650: ^ task is: None 18445 1726882536.12652: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18445 1726882536.12653: done queuing things up, now waiting for results queue to drain 18445 1726882536.12656: results queue empty 18445 1726882536.12657: checking for any_errors_fatal 18445 1726882536.12657: done checking for any_errors_fatal 18445 1726882536.12658: checking for max_fail_percentage 18445 1726882536.12659: done checking for max_fail_percentage 18445 1726882536.12660: checking to see if all hosts have failed and the running result is not ok 18445 1726882536.12660: done checking to see if all hosts have failed 18445 1726882536.12662: getting the next task for host managed_node1 18445 1726882536.12666: done getting next task for host managed_node1 18445 1726882536.12667: ^ task is: None 18445 1726882536.12668: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882536.12698: in VariableManager get_vars() 18445 1726882536.12712: done with get_vars() 18445 1726882536.12717: in VariableManager get_vars() 18445 1726882536.12726: done with get_vars() 18445 1726882536.12730: variable 'omit' from source: magic vars 18445 1726882536.12760: in VariableManager get_vars() 18445 1726882536.12772: done with get_vars() 18445 1726882536.12790: variable 'omit' from source: magic vars PLAY [Assert device and profile are absent] ************************************ 18445 1726882536.12974: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18445 1726882536.12996: getting the remaining hosts for this loop 18445 1726882536.12997: done getting the remaining hosts for this loop 18445 1726882536.12999: getting the next task for host managed_node1 18445 1726882536.13002: done getting next task for host managed_node1 18445 1726882536.13003: ^ task is: TASK: Gathering Facts 18445 1726882536.13005: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18445 1726882536.13007: getting variables 18445 1726882536.13008: in VariableManager get_vars() 18445 1726882536.13015: Calling all_inventory to load vars for managed_node1 18445 1726882536.13017: Calling groups_inventory to load vars for managed_node1 18445 1726882536.13019: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882536.13023: Calling all_plugins_play to load vars for managed_node1 18445 1726882536.13025: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882536.13028: Calling groups_plugins_play to load vars for managed_node1 18445 1726882536.13167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882536.13347: done with get_vars() 18445 1726882536.13357: done getting variables 18445 1726882536.13395: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:68 Friday 20 September 2024 21:35:36 -0400 (0:00:00.103) 0:00:08.071 ****** 18445 1726882536.13421: entering _queue_task() for managed_node1/gather_facts 18445 1726882536.13772: worker is 1 (out of 1 available) 18445 1726882536.13781: exiting _queue_task() for managed_node1/gather_facts 18445 1726882536.13793: done queuing things up, now waiting for results queue to drain 18445 1726882536.13794: waiting for pending results... 
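
The repeated "meta (flush_handlers)" entries above are the implicit handler-flush steps Ansible inserts at the end of each play; the same flush can be requested explicitly mid-play, for example:

    - name: Force any notified handlers to run now
      ansible.builtin.meta: flush_handlers
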
18445 1726882536.14044: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18445 1726882536.14284: in run() - task 0e448fcc-3ce9-f6eb-935c-0000000002cb 18445 1726882536.14302: variable 'ansible_search_path' from source: unknown 18445 1726882536.14327: calling self._execute() 18445 1726882536.14386: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882536.14397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882536.14408: variable 'omit' from source: magic vars 18445 1726882536.14796: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882536.18849: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882536.19047: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882536.19095: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882536.19249: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882536.19287: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882536.19487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882536.19521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882536.19552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882536.19606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882536.19688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882536.19941: variable 'ansible_distribution' from source: facts 18445 1726882536.20007: variable 'ansible_distribution_major_version' from source: facts 18445 1726882536.20029: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882536.20111: when evaluation is False, skipping this task 18445 1726882536.20119: _execute() done 18445 1726882536.20126: dumping result to json 18445 1726882536.20133: done dumping result, returning 18445 1726882536.20143: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0e448fcc-3ce9-f6eb-935c-0000000002cb] 18445 1726882536.20157: sending task result for task 0e448fcc-3ce9-f6eb-935c-0000000002cb skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882536.20398: no more pending results, returning what we have 18445 1726882536.20402: results queue empty 18445 
1726882536.20403: checking for any_errors_fatal 18445 1726882536.20404: done checking for any_errors_fatal 18445 1726882536.20405: checking for max_fail_percentage 18445 1726882536.20407: done checking for max_fail_percentage 18445 1726882536.20408: checking to see if all hosts have failed and the running result is not ok 18445 1726882536.20409: done checking to see if all hosts have failed 18445 1726882536.20409: getting the remaining hosts for this loop 18445 1726882536.20411: done getting the remaining hosts for this loop 18445 1726882536.20415: getting the next task for host managed_node1 18445 1726882536.20421: done getting next task for host managed_node1 18445 1726882536.20423: ^ task is: TASK: meta (flush_handlers) 18445 1726882536.20425: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882536.20428: getting variables 18445 1726882536.20429: in VariableManager get_vars() 18445 1726882536.20456: Calling all_inventory to load vars for managed_node1 18445 1726882536.20459: Calling groups_inventory to load vars for managed_node1 18445 1726882536.20463: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882536.20475: Calling all_plugins_play to load vars for managed_node1 18445 1726882536.20478: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882536.20482: Calling groups_plugins_play to load vars for managed_node1 18445 1726882536.20645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882536.20839: done with get_vars() 18445 1726882536.20849: done getting variables 18445 1726882536.21382: done sending task result for task 0e448fcc-3ce9-f6eb-935c-0000000002cb 18445 1726882536.21385: WORKER PROCESS EXITING 18445 1726882536.21436: in VariableManager get_vars() 18445 1726882536.21445: Calling all_inventory to load vars for managed_node1 18445 1726882536.21447: Calling groups_inventory to load vars for managed_node1 18445 1726882536.21449: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882536.21456: Calling all_plugins_play to load vars for managed_node1 18445 1726882536.21458: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882536.21461: Calling groups_plugins_play to load vars for managed_node1 18445 1726882536.21604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882536.21791: done with get_vars() 18445 1726882536.21804: done queuing things up, now waiting for results queue to drain 18445 1726882536.21806: results queue empty 18445 1726882536.21807: checking for any_errors_fatal 18445 1726882536.21809: done checking for any_errors_fatal 18445 1726882536.21809: checking for max_fail_percentage 18445 1726882536.21810: done checking for max_fail_percentage 18445 1726882536.21811: checking to see if all hosts have failed and the running result is not ok 18445 1726882536.21812: done checking to see if all hosts have failed 18445 1726882536.21813: getting the remaining hosts for this loop 18445 1726882536.21814: done getting the remaining hosts for this loop 18445 1726882536.21816: getting the next task for host managed_node1 18445 1726882536.21819: done getting next task for host managed_node1 18445 
1726882536.21822: ^ task is: TASK: Include the task 'assert_profile_absent.yml' 18445 1726882536.21823: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882536.21825: getting variables 18445 1726882536.21826: in VariableManager get_vars() 18445 1726882536.21834: Calling all_inventory to load vars for managed_node1 18445 1726882536.21836: Calling groups_inventory to load vars for managed_node1 18445 1726882536.21838: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882536.21847: Calling all_plugins_play to load vars for managed_node1 18445 1726882536.21849: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882536.21852: Calling groups_plugins_play to load vars for managed_node1 18445 1726882536.21989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882536.22876: done with get_vars() 18445 1726882536.22884: done getting variables TASK [Include the task 'assert_profile_absent.yml'] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:71 Friday 20 September 2024 21:35:36 -0400 (0:00:00.095) 0:00:08.166 ****** 18445 1726882536.22956: entering _queue_task() for managed_node1/include_tasks 18445 1726882536.23441: worker is 1 (out of 1 available) 18445 1726882536.23452: exiting _queue_task() for managed_node1/include_tasks 18445 1726882536.23473: done queuing things up, now waiting for results queue to drain 18445 1726882536.23475: waiting for pending results... 
18445 1726882536.24509: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_profile_absent.yml' 18445 1726882536.24783: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000074 18445 1726882536.24840: variable 'ansible_search_path' from source: unknown 18445 1726882536.24882: calling self._execute() 18445 1726882536.25144: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882536.25155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882536.25179: variable 'omit' from source: magic vars 18445 1726882536.25909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882536.30402: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882536.30576: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882536.30621: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882536.30677: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882536.30709: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882536.30796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882536.30833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882536.30872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882536.30922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882536.30942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882536.31247: variable 'ansible_distribution' from source: facts 18445 1726882536.31259: variable 'ansible_distribution_major_version' from source: facts 18445 1726882536.31288: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882536.31296: when evaluation is False, skipping this task 18445 1726882536.31303: _execute() done 18445 1726882536.31309: dumping result to json 18445 1726882536.31316: done dumping result, returning 18445 1726882536.31326: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_profile_absent.yml' [0e448fcc-3ce9-f6eb-935c-000000000074] 18445 1726882536.31353: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000074 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882536.31538: no more pending results, returning what we 
have 18445 1726882536.31543: results queue empty 18445 1726882536.31544: checking for any_errors_fatal 18445 1726882536.31546: done checking for any_errors_fatal 18445 1726882536.31547: checking for max_fail_percentage 18445 1726882536.31548: done checking for max_fail_percentage 18445 1726882536.31549: checking to see if all hosts have failed and the running result is not ok 18445 1726882536.31550: done checking to see if all hosts have failed 18445 1726882536.31551: getting the remaining hosts for this loop 18445 1726882536.31553: done getting the remaining hosts for this loop 18445 1726882536.31557: getting the next task for host managed_node1 18445 1726882536.31565: done getting next task for host managed_node1 18445 1726882536.31568: ^ task is: TASK: Include the task 'assert_device_absent.yml' 18445 1726882536.31570: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882536.31573: getting variables 18445 1726882536.31575: in VariableManager get_vars() 18445 1726882536.31605: Calling all_inventory to load vars for managed_node1 18445 1726882536.31608: Calling groups_inventory to load vars for managed_node1 18445 1726882536.31612: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882536.31622: Calling all_plugins_play to load vars for managed_node1 18445 1726882536.31625: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882536.31628: Calling groups_plugins_play to load vars for managed_node1 18445 1726882536.31812: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882536.32062: done with get_vars() 18445 1726882536.32074: done getting variables TASK [Include the task 'assert_device_absent.yml'] ***************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:75 Friday 20 September 2024 21:35:36 -0400 (0:00:00.092) 0:00:08.258 ****** 18445 1726882536.32178: entering _queue_task() for managed_node1/include_tasks 18445 1726882536.32204: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000074 18445 1726882536.32312: WORKER PROCESS EXITING 18445 1726882536.33038: worker is 1 (out of 1 available) 18445 1726882536.33049: exiting _queue_task() for managed_node1/include_tasks 18445 1726882536.33060: done queuing things up, now waiting for results queue to drain 18445 1726882536.33180: waiting for pending results... 
18445 1726882536.33747: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_absent.yml' 18445 1726882536.33958: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000075 18445 1726882536.33980: variable 'ansible_search_path' from source: unknown 18445 1726882536.34022: calling self._execute() 18445 1726882536.34206: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882536.34279: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882536.34295: variable 'omit' from source: magic vars 18445 1726882536.35211: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882536.38610: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882536.38683: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882536.38722: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882536.38761: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882536.38899: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882536.38976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882536.39098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882536.39133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882536.39210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882536.39286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882536.39530: variable 'ansible_distribution' from source: facts 18445 1726882536.39556: variable 'ansible_distribution_major_version' from source: facts 18445 1726882536.39678: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882536.39686: when evaluation is False, skipping this task 18445 1726882536.39692: _execute() done 18445 1726882536.39699: dumping result to json 18445 1726882536.39706: done dumping result, returning 18445 1726882536.39717: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_absent.yml' [0e448fcc-3ce9-f6eb-935c-000000000075] 18445 1726882536.39727: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000075 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882536.39869: no more pending results, returning what we have 
18445 1726882536.39873: results queue empty 18445 1726882536.39874: checking for any_errors_fatal 18445 1726882536.39880: done checking for any_errors_fatal 18445 1726882536.39881: checking for max_fail_percentage 18445 1726882536.39882: done checking for max_fail_percentage 18445 1726882536.39883: checking to see if all hosts have failed and the running result is not ok 18445 1726882536.39884: done checking to see if all hosts have failed 18445 1726882536.39885: getting the remaining hosts for this loop 18445 1726882536.39887: done getting the remaining hosts for this loop 18445 1726882536.39891: getting the next task for host managed_node1 18445 1726882536.39898: done getting next task for host managed_node1 18445 1726882536.39900: ^ task is: TASK: meta (flush_handlers) 18445 1726882536.39902: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882536.39906: getting variables 18445 1726882536.39908: in VariableManager get_vars() 18445 1726882536.39937: Calling all_inventory to load vars for managed_node1 18445 1726882536.39940: Calling groups_inventory to load vars for managed_node1 18445 1726882536.39943: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882536.39953: Calling all_plugins_play to load vars for managed_node1 18445 1726882536.39957: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882536.39960: Calling groups_plugins_play to load vars for managed_node1 18445 1726882536.40141: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882536.40332: done with get_vars() 18445 1726882536.40343: done getting variables 18445 1726882536.40416: in VariableManager get_vars() 18445 1726882536.40424: Calling all_inventory to load vars for managed_node1 18445 1726882536.40427: Calling groups_inventory to load vars for managed_node1 18445 1726882536.40429: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882536.40433: Calling all_plugins_play to load vars for managed_node1 18445 1726882536.40436: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882536.40439: Calling groups_plugins_play to load vars for managed_node1 18445 1726882536.40976: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000075 18445 1726882536.40980: WORKER PROCESS EXITING 18445 1726882536.40986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882536.41311: done with get_vars() 18445 1726882536.41323: done queuing things up, now waiting for results queue to drain 18445 1726882536.41325: results queue empty 18445 1726882536.41326: checking for any_errors_fatal 18445 1726882536.41328: done checking for any_errors_fatal 18445 1726882536.41329: checking for max_fail_percentage 18445 1726882536.41330: done checking for max_fail_percentage 18445 1726882536.41331: checking to see if all hosts have failed and the running result is not ok 18445 1726882536.41331: done checking to see if all hosts have failed 18445 1726882536.41332: getting the remaining hosts for this loop 18445 1726882536.41333: done getting the remaining hosts for this loop 18445 1726882536.41335: getting the next task for host managed_node1 18445 1726882536.41339: done getting 
next task for host managed_node1 18445 1726882536.41340: ^ task is: TASK: meta (flush_handlers) 18445 1726882536.41341: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882536.41344: getting variables 18445 1726882536.41345: in VariableManager get_vars() 18445 1726882536.41351: Calling all_inventory to load vars for managed_node1 18445 1726882536.41353: Calling groups_inventory to load vars for managed_node1 18445 1726882536.41355: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882536.41365: Calling all_plugins_play to load vars for managed_node1 18445 1726882536.41367: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882536.41369: Calling groups_plugins_play to load vars for managed_node1 18445 1726882536.41493: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882536.41664: done with get_vars() 18445 1726882536.41671: done getting variables 18445 1726882536.41709: in VariableManager get_vars() 18445 1726882536.41716: Calling all_inventory to load vars for managed_node1 18445 1726882536.41717: Calling groups_inventory to load vars for managed_node1 18445 1726882536.41719: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882536.41722: Calling all_plugins_play to load vars for managed_node1 18445 1726882536.41724: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882536.41727: Calling groups_plugins_play to load vars for managed_node1 18445 1726882536.41859: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882536.42034: done with get_vars() 18445 1726882536.42045: done queuing things up, now waiting for results queue to drain 18445 1726882536.42046: results queue empty 18445 1726882536.42050: checking for any_errors_fatal 18445 1726882536.42051: done checking for any_errors_fatal 18445 1726882536.42052: checking for max_fail_percentage 18445 1726882536.42053: done checking for max_fail_percentage 18445 1726882536.42054: checking to see if all hosts have failed and the running result is not ok 18445 1726882536.42054: done checking to see if all hosts have failed 18445 1726882536.42055: getting the remaining hosts for this loop 18445 1726882536.42056: done getting the remaining hosts for this loop 18445 1726882536.42058: getting the next task for host managed_node1 18445 1726882536.42060: done getting next task for host managed_node1 18445 1726882536.42061: ^ task is: None 18445 1726882536.42063: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18445 1726882536.42065: done queuing things up, now waiting for results queue to drain 18445 1726882536.42066: results queue empty 18445 1726882536.42067: checking for any_errors_fatal 18445 1726882536.42067: done checking for any_errors_fatal 18445 1726882536.42068: checking for max_fail_percentage 18445 1726882536.42069: done checking for max_fail_percentage 18445 1726882536.42070: checking to see if all hosts have failed and the running result is not ok 18445 1726882536.42070: done checking to see if all hosts have failed 18445 1726882536.42071: getting the next task for host managed_node1 18445 1726882536.42075: done getting next task for host managed_node1 18445 1726882536.42075: ^ task is: None 18445 1726882536.42077: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882536.42105: in VariableManager get_vars() 18445 1726882536.42118: done with get_vars() 18445 1726882536.42123: in VariableManager get_vars() 18445 1726882536.42131: done with get_vars() 18445 1726882536.42135: variable 'omit' from source: magic vars 18445 1726882536.42162: in VariableManager get_vars() 18445 1726882536.42172: done with get_vars() 18445 1726882536.42191: variable 'omit' from source: magic vars PLAY [Verify that cleanup restored state to default] *************************** 18445 1726882536.42399: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18445 1726882536.42419: getting the remaining hosts for this loop 18445 1726882536.42420: done getting the remaining hosts for this loop 18445 1726882536.42423: getting the next task for host managed_node1 18445 1726882536.42425: done getting next task for host managed_node1 18445 1726882536.42427: ^ task is: TASK: Gathering Facts 18445 1726882536.42428: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18445 1726882536.42430: getting variables 18445 1726882536.42431: in VariableManager get_vars() 18445 1726882536.42438: Calling all_inventory to load vars for managed_node1 18445 1726882536.42440: Calling groups_inventory to load vars for managed_node1 18445 1726882536.42442: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882536.42447: Calling all_plugins_play to load vars for managed_node1 18445 1726882536.42449: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882536.42451: Calling groups_plugins_play to load vars for managed_node1 18445 1726882536.42586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882536.42769: done with get_vars() 18445 1726882536.42776: done getting variables 18445 1726882536.42811: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:77 Friday 20 September 2024 21:35:36 -0400 (0:00:00.106) 0:00:08.365 ****** 18445 1726882536.42832: entering _queue_task() for managed_node1/gather_facts 18445 1726882536.43160: worker is 1 (out of 1 available) 18445 1726882536.43182: exiting _queue_task() for managed_node1/gather_facts 18445 1726882536.43192: done queuing things up, now waiting for results queue to drain 18445 1726882536.43193: waiting for pending results... 
18445 1726882536.43428: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18445 1726882536.43514: in run() - task 0e448fcc-3ce9-f6eb-935c-0000000002e3 18445 1726882536.43536: variable 'ansible_search_path' from source: unknown 18445 1726882536.43575: calling self._execute() 18445 1726882536.43683: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882536.43695: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882536.43709: variable 'omit' from source: magic vars 18445 1726882536.44122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882536.46577: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882536.46646: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882536.46930: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882536.46971: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882536.47004: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882536.47089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882536.47127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882536.47158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882536.47205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882536.47229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882536.47365: variable 'ansible_distribution' from source: facts 18445 1726882536.47388: variable 'ansible_distribution_major_version' from source: facts 18445 1726882536.47417: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882536.47425: when evaluation is False, skipping this task 18445 1726882536.47431: _execute() done 18445 1726882536.47437: dumping result to json 18445 1726882536.47448: done dumping result, returning 18445 1726882536.47458: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0e448fcc-3ce9-f6eb-935c-0000000002e3] 18445 1726882536.47499: sending task result for task 0e448fcc-3ce9-f6eb-935c-0000000002e3 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882536.47623: no more pending results, returning what we have 18445 1726882536.47627: results queue empty 18445 
1726882536.47628: checking for any_errors_fatal 18445 1726882536.47629: done checking for any_errors_fatal 18445 1726882536.47630: checking for max_fail_percentage 18445 1726882536.47631: done checking for max_fail_percentage 18445 1726882536.47632: checking to see if all hosts have failed and the running result is not ok 18445 1726882536.47633: done checking to see if all hosts have failed 18445 1726882536.47634: getting the remaining hosts for this loop 18445 1726882536.47635: done getting the remaining hosts for this loop 18445 1726882536.47639: getting the next task for host managed_node1 18445 1726882536.47645: done getting next task for host managed_node1 18445 1726882536.47647: ^ task is: TASK: meta (flush_handlers) 18445 1726882536.47649: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882536.47654: getting variables 18445 1726882536.47655: in VariableManager get_vars() 18445 1726882536.47684: Calling all_inventory to load vars for managed_node1 18445 1726882536.47687: Calling groups_inventory to load vars for managed_node1 18445 1726882536.47690: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882536.47701: Calling all_plugins_play to load vars for managed_node1 18445 1726882536.47704: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882536.47708: Calling groups_plugins_play to load vars for managed_node1 18445 1726882536.47908: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882536.48130: done with get_vars() 18445 1726882536.48139: done getting variables 18445 1726882536.48207: in VariableManager get_vars() 18445 1726882536.48215: Calling all_inventory to load vars for managed_node1 18445 1726882536.48217: Calling groups_inventory to load vars for managed_node1 18445 1726882536.48219: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882536.48223: Calling all_plugins_play to load vars for managed_node1 18445 1726882536.48225: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882536.48228: Calling groups_plugins_play to load vars for managed_node1 18445 1726882536.48359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882536.48541: done with get_vars() 18445 1726882536.48553: done queuing things up, now waiting for results queue to drain 18445 1726882536.48555: results queue empty 18445 1726882536.48556: checking for any_errors_fatal 18445 1726882536.48557: done checking for any_errors_fatal 18445 1726882536.48558: checking for max_fail_percentage 18445 1726882536.48559: done checking for max_fail_percentage 18445 1726882536.48559: checking to see if all hosts have failed and the running result is not ok 18445 1726882536.48560: done checking to see if all hosts have failed 18445 1726882536.48561: getting the remaining hosts for this loop 18445 1726882536.48562: done getting the remaining hosts for this loop 18445 1726882536.48565: getting the next task for host managed_node1 18445 1726882536.48569: done getting next task for host managed_node1 18445 1726882536.48572: ^ task is: TASK: Verify network state restored to default 18445 1726882536.48573: ^ state is: HOST STATE: block=2, task=1, rescue=0, 
always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882536.48575: getting variables 18445 1726882536.48576: in VariableManager get_vars() 18445 1726882536.48584: Calling all_inventory to load vars for managed_node1 18445 1726882536.48586: Calling groups_inventory to load vars for managed_node1 18445 1726882536.48588: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882536.48593: Calling all_plugins_play to load vars for managed_node1 18445 1726882536.48595: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882536.48603: Calling groups_plugins_play to load vars for managed_node1 18445 1726882536.48731: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882536.49156: done with get_vars() 18445 1726882536.49165: done getting variables 18445 1726882536.49183: done sending task result for task 0e448fcc-3ce9-f6eb-935c-0000000002e3 18445 1726882536.49186: WORKER PROCESS EXITING TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:80 Friday 20 September 2024 21:35:36 -0400 (0:00:00.064) 0:00:08.429 ****** 18445 1726882536.49238: entering _queue_task() for managed_node1/include_tasks 18445 1726882536.49449: worker is 1 (out of 1 available) 18445 1726882536.49460: exiting _queue_task() for managed_node1/include_tasks 18445 1726882536.49473: done queuing things up, now waiting for results queue to drain 18445 1726882536.49474: waiting for pending results... 
18445 1726882536.49722: running TaskExecutor() for managed_node1/TASK: Verify network state restored to default 18445 1726882536.49814: in run() - task 0e448fcc-3ce9-f6eb-935c-000000000078 18445 1726882536.49832: variable 'ansible_search_path' from source: unknown 18445 1726882536.49874: calling self._execute() 18445 1726882536.49949: variable 'ansible_host' from source: host vars for 'managed_node1' 18445 1726882536.49960: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18445 1726882536.49975: variable 'omit' from source: magic vars 18445 1726882536.50392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18445 1726882536.52917: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18445 1726882536.52985: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18445 1726882536.53025: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18445 1726882536.53062: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18445 1726882536.53097: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18445 1726882536.53172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18445 1726882536.53209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18445 1726882536.53239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18445 1726882536.53290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18445 1726882536.53309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18445 1726882536.53436: variable 'ansible_distribution' from source: facts 18445 1726882536.53446: variable 'ansible_distribution_major_version' from source: facts 18445 1726882536.53470: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 18445 1726882536.53478: when evaluation is False, skipping this task 18445 1726882536.53484: _execute() done 18445 1726882536.53490: dumping result to json 18445 1726882536.53496: done dumping result, returning 18445 1726882536.53508: done running TaskExecutor() for managed_node1/TASK: Verify network state restored to default [0e448fcc-3ce9-f6eb-935c-000000000078] 18445 1726882536.53518: sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000078 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 18445 1726882536.53649: no more pending results, returning what we have 18445 
1726882536.53653: results queue empty 18445 1726882536.53654: checking for any_errors_fatal 18445 1726882536.53655: done checking for any_errors_fatal 18445 1726882536.53656: checking for max_fail_percentage 18445 1726882536.53657: done checking for max_fail_percentage 18445 1726882536.53658: checking to see if all hosts have failed and the running result is not ok 18445 1726882536.53659: done checking to see if all hosts have failed 18445 1726882536.53660: getting the remaining hosts for this loop 18445 1726882536.53662: done getting the remaining hosts for this loop 18445 1726882536.53668: getting the next task for host managed_node1 18445 1726882536.53676: done getting next task for host managed_node1 18445 1726882536.53678: ^ task is: TASK: meta (flush_handlers) 18445 1726882536.53680: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882536.53684: getting variables 18445 1726882536.53686: in VariableManager get_vars() 18445 1726882536.53715: Calling all_inventory to load vars for managed_node1 18445 1726882536.53717: Calling groups_inventory to load vars for managed_node1 18445 1726882536.53721: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882536.53731: Calling all_plugins_play to load vars for managed_node1 18445 1726882536.53734: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882536.53737: Calling groups_plugins_play to load vars for managed_node1 18445 1726882536.53915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882536.54111: done with get_vars() 18445 1726882536.54121: done getting variables 18445 1726882536.54190: in VariableManager get_vars() 18445 1726882536.54198: Calling all_inventory to load vars for managed_node1 18445 1726882536.54200: Calling groups_inventory to load vars for managed_node1 18445 1726882536.54202: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882536.54205: Calling all_plugins_play to load vars for managed_node1 18445 1726882536.54207: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882536.54209: Calling groups_plugins_play to load vars for managed_node1 18445 1726882536.54555: done sending task result for task 0e448fcc-3ce9-f6eb-935c-000000000078 18445 1726882536.54559: WORKER PROCESS EXITING 18445 1726882536.54578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882536.54782: done with get_vars() 18445 1726882536.54794: done queuing things up, now waiting for results queue to drain 18445 1726882536.54796: results queue empty 18445 1726882536.54796: checking for any_errors_fatal 18445 1726882536.54798: done checking for any_errors_fatal 18445 1726882536.54799: checking for max_fail_percentage 18445 1726882536.54800: done checking for max_fail_percentage 18445 1726882536.54801: checking to see if all hosts have failed and the running result is not ok 18445 1726882536.54802: done checking to see if all hosts have failed 18445 1726882536.54802: getting the remaining hosts for this loop 18445 1726882536.54803: done getting the remaining hosts for this loop 18445 1726882536.54805: getting the next task for host managed_node1 18445 1726882536.54809: done getting next 
task for host managed_node1 18445 1726882536.54810: ^ task is: TASK: meta (flush_handlers) 18445 1726882536.54811: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18445 1726882536.54814: getting variables 18445 1726882536.54815: in VariableManager get_vars() 18445 1726882536.54821: Calling all_inventory to load vars for managed_node1 18445 1726882536.54823: Calling groups_inventory to load vars for managed_node1 18445 1726882536.54825: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882536.54834: Calling all_plugins_play to load vars for managed_node1 18445 1726882536.54837: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882536.54840: Calling groups_plugins_play to load vars for managed_node1 18445 1726882536.54973: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882536.55148: done with get_vars() 18445 1726882536.55156: done getting variables 18445 1726882536.55197: in VariableManager get_vars() 18445 1726882536.55205: Calling all_inventory to load vars for managed_node1 18445 1726882536.55208: Calling groups_inventory to load vars for managed_node1 18445 1726882536.55210: Calling all_plugins_inventory to load vars for managed_node1 18445 1726882536.55214: Calling all_plugins_play to load vars for managed_node1 18445 1726882536.55216: Calling groups_plugins_inventory to load vars for managed_node1 18445 1726882536.55219: Calling groups_plugins_play to load vars for managed_node1 18445 1726882536.55343: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18445 1726882536.55518: done with get_vars() 18445 1726882536.55529: done queuing things up, now waiting for results queue to drain 18445 1726882536.55531: results queue empty 18445 1726882536.55532: checking for any_errors_fatal 18445 1726882536.55533: done checking for any_errors_fatal 18445 1726882536.55534: checking for max_fail_percentage 18445 1726882536.55534: done checking for max_fail_percentage 18445 1726882536.55535: checking to see if all hosts have failed and the running result is not ok 18445 1726882536.55536: done checking to see if all hosts have failed 18445 1726882536.55536: getting the remaining hosts for this loop 18445 1726882536.55537: done getting the remaining hosts for this loop 18445 1726882536.55540: getting the next task for host managed_node1 18445 1726882536.55542: done getting next task for host managed_node1 18445 1726882536.55543: ^ task is: None 18445 1726882536.55544: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18445 1726882536.55545: done queuing things up, now waiting for results queue to drain 18445 1726882536.55546: results queue empty 18445 1726882536.55547: checking for any_errors_fatal 18445 1726882536.55548: done checking for any_errors_fatal 18445 1726882536.55548: checking for max_fail_percentage 18445 1726882536.55549: done checking for max_fail_percentage 18445 1726882536.55550: checking to see if all hosts have failed and the running result is not ok 18445 1726882536.55551: done checking to see if all hosts have failed 18445 1726882536.55552: getting the next task for host managed_node1 18445 1726882536.55554: done getting next task for host managed_node1 18445 1726882536.55555: ^ task is: None 18445 1726882536.55556: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
PLAY RECAP *********************************************************************
managed_node1              : ok=7    changed=0    unreachable=0    failed=0    skipped=93   rescued=0    ignored=0

Friday 20 September 2024 21:35:36 -0400 (0:00:00.063)       0:00:08.493 ******
===============================================================================
Gathering Facts --------------------------------------------------------- 1.62s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_initscripts.yml:5
Check if system is ostree ----------------------------------------------- 0.77s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
Gather the minimum subset of ansible_facts required by the network role test --- 0.77s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
fedora.linux_system_roles.network : Show debug messages for the network_connections --- 0.11s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
Include the task 'delete_interface.yml' --------------------------------- 0.11s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:8
Include the task 'assert_device_absent.yml' ----------------------------- 0.11s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:75
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.10s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Include the task 'assert_output_in_stderr_without_warnings.yml' --------- 0.10s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:47
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.10s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Gathering Facts --------------------------------------------------------- 0.10s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:68
fedora.linux_system_roles.network : Show stderr messages for the network_connections --- 0.09s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177
Include the task 'assert_profile_absent.yml' ---------------------------- 0.09s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:71
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.09s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Show debug messages for the network_state --- 0.08s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
fedora.linux_system_roles.network : Ensure initscripts network file dependency is present --- 0.08s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150
Gathering Facts --------------------------------------------------------- 0.08s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5
fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 --- 0.08s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18
fedora.linux_system_roles.network : Configure networking state ---------- 0.08s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171
fedora.linux_system_roles.network : Configure networking state ---------- 0.08s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171
fedora.linux_system_roles.network : Enable network service -------------- 0.08s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142
18445 1726882536.55638: RUNNING CLEANUP
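
Note on the skips recorded above: every task in this section of the trace (including the Gathering Facts plays) evaluates the same guard, (ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9), it comes back False on this managed node, and the task is skipped, which is why the recap ends at ok=7 with skipped=93. As a rough illustration only (an assumed sketch, not the actual contents of tests_ethernet.yml), a task guarded this way looks roughly like:

# Hypothetical sketch of the kind of 'when:' guard behind the
# "Evaluated conditional ... False" / "skipping: [managed_node1]" entries above.
# The task name is taken from the log; the include path is an assumption.
- name: Include the task 'assert_profile_absent.yml'
  ansible.builtin.include_tasks: tasks/assert_profile_absent.yml
  when:
    - ansible_distribution in ['CentOS', 'RedHat']
    - ansible_distribution_major_version | int < 9

Both list items must be true for the include to run, so any host that is not CentOS/RHEL older than 9 produces the skip results shown in this trace instead of executing the included file.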