28285 1727204258.67553: starting run ansible-playbook [core 2.17.4] config file = None configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules'] ansible python module location = /usr/local/lib/python3.12/site-packages/ansible ansible collection location = /tmp/collections-G1p executable location = /usr/local/bin/ansible-playbook python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] (/usr/bin/python3.12) jinja version = 3.1.4 libyaml = True No config file found; using defaults 28285 1727204258.68422: Added group all to inventory 28285 1727204258.68425: Added group ungrouped to inventory 28285 1727204258.68430: Group all now contains ungrouped 28285 1727204258.68433: Examining possible inventory source: /tmp/network-M6W/inventory-5vW.yml 28285 1727204258.80358: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache 28285 1727204258.80402: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py 28285 1727204258.80420: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory 28285 1727204258.80465: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py 28285 1727204258.80513: Loaded config def from plugin (inventory/script) 28285 1727204258.80515: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py 28285 1727204258.80544: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py 28285 1727204258.80605: Loaded config def from plugin (inventory/yaml) 28285 1727204258.80607: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 28285 1727204258.80672: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 28285 1727204258.80954: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 28285 1727204258.80957: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 28285 1727204258.80960: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 28285 1727204258.80966: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 28285 1727204258.80970: Loading data from /tmp/network-M6W/inventory-5vW.yml 28285 1727204258.81012: /tmp/network-M6W/inventory-5vW.yml was not parsable by auto 28285 1727204258.81058: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 28285 1727204258.81090: Loading data from /tmp/network-M6W/inventory-5vW.yml 28285 1727204258.81142: group all already in inventory 28285 1727204258.81150: set inventory_file for managed-node1 28285 1727204258.81153: set inventory_dir for managed-node1 28285 1727204258.81154: Added host managed-node1 to inventory 28285 1727204258.81155: Added host managed-node1 to group all 28285 1727204258.81156: set ansible_host for managed-node1 28285 1727204258.81156: set ansible_ssh_extra_args for managed-node1 28285 1727204258.81159: set inventory_file for managed-node2 28285 1727204258.81160: set inventory_dir for managed-node2 28285 1727204258.81161: Added host managed-node2 to inventory 28285 1727204258.81162: Added host managed-node2 to group 
all 28285 1727204258.81162: set ansible_host for managed-node2 28285 1727204258.81163: set ansible_ssh_extra_args for managed-node2 28285 1727204258.81166: set inventory_file for managed-node3 28285 1727204258.81168: set inventory_dir for managed-node3 28285 1727204258.81168: Added host managed-node3 to inventory 28285 1727204258.81169: Added host managed-node3 to group all 28285 1727204258.81169: set ansible_host for managed-node3 28285 1727204258.81170: set ansible_ssh_extra_args for managed-node3 28285 1727204258.81172: Reconcile groups and hosts in inventory. 28285 1727204258.81174: Group ungrouped now contains managed-node1 28285 1727204258.81176: Group ungrouped now contains managed-node2 28285 1727204258.81178: Group ungrouped now contains managed-node3 28285 1727204258.81230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 28285 1727204258.81314: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 28285 1727204258.81344: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 28285 1727204258.81366: Loaded config def from plugin (vars/host_group_vars) 28285 1727204258.81368: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 28285 1727204258.81373: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 28285 1727204258.81378: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 28285 1727204258.81408: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 28285 1727204258.81653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204258.81726: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 28285 1727204258.81751: Loaded config def from plugin (connection/local) 28285 1727204258.81754: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 28285 1727204258.82097: Loaded config def from plugin (connection/paramiko_ssh) 28285 1727204258.82099: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 28285 1727204258.82715: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 28285 1727204258.82738: Loaded config def from plugin (connection/psrp) 28285 1727204258.82740: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 28285 1727204258.83161: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 28285 1727204258.83186: Loaded config def from plugin (connection/ssh) 28285 1727204258.83188: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 28285 1727204258.83403: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 28285 1727204258.83425: Loaded config def from plugin (connection/winrm) 28285 1727204258.83427: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 28285 1727204258.83447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 28285 1727204258.83493: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 28285 1727204258.83530: Loaded config def from plugin (shell/cmd) 28285 1727204258.83531: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 28285 1727204258.83548: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 28285 1727204258.83589: Loaded config def from plugin (shell/powershell) 28285 1727204258.83590: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 28285 1727204258.83625: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 28285 1727204258.83730: Loaded config def from plugin (shell/sh) 28285 1727204258.83732: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 28285 1727204258.83755: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 28285 1727204258.83830: Loaded config def from plugin (become/runas) 28285 1727204258.83832: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 28285 1727204258.83965: Loaded config def from plugin (become/su) 28285 1727204258.83967: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 28285 1727204258.84066: Loaded config def from plugin (become/sudo) 28285 1727204258.84068: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 28285 1727204258.84090: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethtool_features_initscripts.yml 28285 1727204258.84311: in VariableManager get_vars() 28285 1727204258.84330: done with get_vars() 28285 1727204258.84423: trying /usr/local/lib/python3.12/site-packages/ansible/modules 28285 1727204258.86816: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 28285 1727204258.86903: in VariableManager get_vars() 28285 1727204258.86906: done with get_vars() 28285 1727204258.86908: variable 'playbook_dir' from source: magic vars 28285 1727204258.86909: variable 'ansible_playbook_python' from source: magic vars 28285 1727204258.86909: variable 
'ansible_config_file' from source: magic vars 28285 1727204258.86910: variable 'groups' from source: magic vars 28285 1727204258.86911: variable 'omit' from source: magic vars 28285 1727204258.86911: variable 'ansible_version' from source: magic vars 28285 1727204258.86912: variable 'ansible_check_mode' from source: magic vars 28285 1727204258.86912: variable 'ansible_diff_mode' from source: magic vars 28285 1727204258.86913: variable 'ansible_forks' from source: magic vars 28285 1727204258.86913: variable 'ansible_inventory_sources' from source: magic vars 28285 1727204258.86914: variable 'ansible_skip_tags' from source: magic vars 28285 1727204258.86914: variable 'ansible_limit' from source: magic vars 28285 1727204258.86915: variable 'ansible_run_tags' from source: magic vars 28285 1727204258.86915: variable 'ansible_verbosity' from source: magic vars 28285 1727204258.86938: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethtool_features.yml 28285 1727204258.87744: in VariableManager get_vars() 28285 1727204258.87758: done with get_vars() 28285 1727204258.87784: in VariableManager get_vars() 28285 1727204258.87792: done with get_vars() 28285 1727204258.87813: in VariableManager get_vars() 28285 1727204258.87820: done with get_vars() 28285 1727204258.88021: in VariableManager get_vars() 28285 1727204258.88032: done with get_vars() 28285 1727204258.88035: variable 'omit' from source: magic vars 28285 1727204258.88051: variable 'omit' from source: magic vars 28285 1727204258.88075: in VariableManager get_vars() 28285 1727204258.88082: done with get_vars() 28285 1727204258.88112: in VariableManager get_vars() 28285 1727204258.88121: done with get_vars() 28285 1727204258.88144: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 28285 1727204258.88288: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 28285 1727204258.88368: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 28285 1727204258.89007: in VariableManager get_vars() 28285 1727204258.89026: done with get_vars() 28285 1727204258.89445: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 28285 1727204258.89592: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28285 1727204258.90906: in VariableManager get_vars() 28285 1727204258.90920: done with get_vars() 28285 1727204258.90933: variable 'omit' from source: magic vars 28285 1727204258.90940: variable 'omit' from source: magic vars 28285 1727204258.90963: in VariableManager get_vars() 28285 1727204258.90974: done with get_vars() 28285 1727204258.90987: in VariableManager get_vars() 28285 1727204258.90996: done with get_vars() 28285 1727204258.91017: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 28285 1727204258.91085: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 28285 1727204258.91132: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 28285 1727204258.91368: in VariableManager get_vars() 28285 1727204258.91382: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28285 
1727204258.93436: in VariableManager get_vars() 28285 1727204258.93460: done with get_vars() 28285 1727204258.93467: variable 'omit' from source: magic vars 28285 1727204258.93483: variable 'omit' from source: magic vars 28285 1727204258.93514: in VariableManager get_vars() 28285 1727204258.93532: done with get_vars() 28285 1727204258.93555: in VariableManager get_vars() 28285 1727204258.93574: done with get_vars() 28285 1727204258.93604: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 28285 1727204258.93752: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 28285 1727204258.93830: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 28285 1727204258.94318: in VariableManager get_vars() 28285 1727204258.94342: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28285 1727204258.97866: in VariableManager get_vars() 28285 1727204258.97910: done with get_vars() 28285 1727204258.97917: variable 'omit' from source: magic vars 28285 1727204258.97929: variable 'omit' from source: magic vars 28285 1727204258.97979: in VariableManager get_vars() 28285 1727204258.98015: done with get_vars() 28285 1727204258.98045: in VariableManager get_vars() 28285 1727204258.98068: done with get_vars() 28285 1727204259.01133: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 28285 1727204259.01273: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 28285 1727204259.01370: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 28285 1727204259.01933: in VariableManager get_vars() 28285 1727204259.01969: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28285 1727204259.05814: in VariableManager get_vars() 28285 1727204259.05851: done with get_vars() 28285 1727204259.05867: variable 'omit' from source: magic vars 28285 1727204259.05895: variable 'omit' from source: magic vars 28285 1727204259.05932: in VariableManager get_vars() 28285 1727204259.05957: done with get_vars() 28285 1727204259.05980: in VariableManager get_vars() 28285 1727204259.06005: done with get_vars() 28285 1727204259.06036: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 28285 1727204259.06186: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 28285 1727204259.07273: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 28285 1727204259.07699: in VariableManager get_vars() 28285 1727204259.07731: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28285 1727204259.11929: in VariableManager get_vars() 28285 1727204259.12084: done with get_vars() 28285 1727204259.12125: in VariableManager get_vars() 28285 1727204259.12272: done with get_vars() 28285 1727204259.12340: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 28285 1727204259.12358: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to 
ansible.posix.debug 28285 1727204259.13183: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 28285 1727204259.13712: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 28285 1727204259.13715: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-G1p/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) 28285 1727204259.13751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 28285 1727204259.13779: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 28285 1727204259.14239: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 28285 1727204259.14309: Loaded config def from plugin (callback/default) 28285 1727204259.14312: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 28285 1727204259.18070: Loaded config def from plugin (callback/junit) 28285 1727204259.18074: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 28285 1727204259.18129: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 28285 1727204259.18507: Loaded config def from plugin (callback/minimal) 28285 1727204259.18510: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 28285 1727204259.18558: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 28285 1727204259.18625: Loaded config def from plugin (callback/tree) 28285 1727204259.18628: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 28285 1727204259.19284: Loaded config def from plugin 
(callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 28285 1727204259.19288: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-G1p/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. PLAYBOOK: tests_ethtool_features_initscripts.yml ******************************* 2 plays in /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethtool_features_initscripts.yml 28285 1727204259.19319: in VariableManager get_vars() 28285 1727204259.19335: done with get_vars() 28285 1727204259.19342: in VariableManager get_vars() 28285 1727204259.19355: done with get_vars() 28285 1727204259.19359: variable 'omit' from source: magic vars 28285 1727204259.19403: in VariableManager get_vars() 28285 1727204259.19419: done with get_vars() 28285 1727204259.19442: variable 'omit' from source: magic vars PLAY [Run playbook 'playbooks/tests_ethtool_features.yml' with initscripts as provider] *** 28285 1727204259.20883: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy 28285 1727204259.20971: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py 28285 1727204259.21006: getting the remaining hosts for this loop 28285 1727204259.21008: done getting the remaining hosts for this loop 28285 1727204259.21013: getting the next task for host managed-node1 28285 1727204259.21018: done getting next task for host managed-node1 28285 1727204259.21019: ^ task is: TASK: Gathering Facts 28285 1727204259.21021: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28285 1727204259.21023: getting variables 28285 1727204259.21024: in VariableManager get_vars() 28285 1727204259.21035: Calling all_inventory to load vars for managed-node1 28285 1727204259.21037: Calling groups_inventory to load vars for managed-node1 28285 1727204259.21040: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204259.21053: Calling all_plugins_play to load vars for managed-node1 28285 1727204259.21067: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204259.21072: Calling groups_plugins_play to load vars for managed-node1 28285 1727204259.21107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204259.21160: done with get_vars() 28285 1727204259.21169: done getting variables 28285 1727204259.21234: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethtool_features_initscripts.yml:5 Tuesday 24 September 2024 14:57:39 -0400 (0:00:00.022) 0:00:00.022 ***** 28285 1727204259.21284: entering _queue_task() for managed-node1/gather_facts 28285 1727204259.21285: Creating lock for gather_facts 28285 1727204259.21632: worker is 1 (out of 1 available) 28285 1727204259.21644: exiting _queue_task() for managed-node1/gather_facts 28285 1727204259.21657: done queuing things up, now waiting for results queue to drain 28285 1727204259.21659: waiting for pending results... 
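
The inventory entries near the top of this log (the yaml plugin parsing /tmp/network-M6W/inventory-5vW.yml, three hosts each getting ansible_host and ansible_ssh_extra_args, all landing in the ungrouped group) are consistent with an inventory file of roughly the following shape. This is a minimal sketch for orientation only: the addresses and SSH options are placeholders, not values recovered from this run (the later ssh debug output suggests managed-node1 resolves to 10.31.9.148).

    all:
      hosts:
        managed-node1:
          ansible_host: 10.31.9.148                       # placeholder; inferred from the ssh debug lines below, not read from the inventory
          ansible_ssh_extra_args: "<extra ssh options>"   # placeholder
        managed-node2:
          ansible_host: "<address>"                       # placeholder
          ansible_ssh_extra_args: "<extra ssh options>"   # placeholder
        managed-node3:
          ansible_host: "<address>"                       # placeholder
          ansible_ssh_extra_args: "<extra ssh options>"   # placeholder
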
28285 1727204259.22890: running TaskExecutor() for managed-node1/TASK: Gathering Facts 28285 1727204259.23087: in run() - task 0affcd87-79f5-57a1-d976-0000000001aa 28285 1727204259.24088: variable 'ansible_search_path' from source: unknown 28285 1727204259.24311: calling self._execute() 28285 1727204259.24383: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204259.24394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204259.24408: variable 'omit' from source: magic vars 28285 1727204259.24918: variable 'omit' from source: magic vars 28285 1727204259.24958: variable 'omit' from source: magic vars 28285 1727204259.25002: variable 'omit' from source: magic vars 28285 1727204259.25080: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28285 1727204259.25121: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28285 1727204259.25147: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28285 1727204259.25389: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28285 1727204259.25406: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28285 1727204259.25437: variable 'inventory_hostname' from source: host vars for 'managed-node1' 28285 1727204259.25578: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204259.25587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204259.25689: Set connection var ansible_shell_executable to /bin/sh 28285 1727204259.25717: Set connection var ansible_pipelining to False 28285 1727204259.25731: Set connection var ansible_timeout to 10 28285 1727204259.25775: Set connection var ansible_shell_type to sh 28285 1727204259.25785: Set connection var ansible_connection to ssh 28285 1727204259.25794: Set connection var ansible_module_compression to ZIP_DEFLATED 28285 1727204259.25820: variable 'ansible_shell_executable' from source: unknown 28285 1727204259.25983: variable 'ansible_connection' from source: unknown 28285 1727204259.25991: variable 'ansible_module_compression' from source: unknown 28285 1727204259.25997: variable 'ansible_shell_type' from source: unknown 28285 1727204259.26004: variable 'ansible_shell_executable' from source: unknown 28285 1727204259.26010: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204259.26017: variable 'ansible_pipelining' from source: unknown 28285 1727204259.26023: variable 'ansible_timeout' from source: unknown 28285 1727204259.26031: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204259.26218: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28285 1727204259.26670: variable 'omit' from source: magic vars 28285 1727204259.26681: starting attempt loop 28285 1727204259.26687: running the handler 28285 1727204259.26706: variable 'ansible_facts' from source: unknown 28285 1727204259.26728: _low_level_execute_command(): starting 28285 1727204259.26741: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28285 1727204259.28677: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28285 1727204259.28681: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28285 1727204259.28830: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28285 1727204259.28833: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28285 1727204259.28836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28285 1727204259.29001: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28285 1727204259.29014: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28285 1727204259.29098: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28285 1727204259.30780: stdout chunk (state=3): >>>/root <<< 28285 1727204259.30881: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28285 1727204259.30969: stderr chunk (state=3): >>><<< 28285 1727204259.30972: stdout chunk (state=3): >>><<< 28285 1727204259.31084: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28285 1727204259.31088: _low_level_execute_command(): starting 28285 1727204259.31092: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204259.3099487-28387-144073599747838 `" && echo ansible-tmp-1727204259.3099487-28387-144073599747838="` echo /root/.ansible/tmp/ansible-tmp-1727204259.3099487-28387-144073599747838 `" ) && sleep 0' 28285 1727204259.32519: 
stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28285 1727204259.32522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28285 1727204259.32621: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 28285 1727204259.32625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28285 1727204259.32636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28285 1727204259.32798: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28285 1727204259.32801: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28285 1727204259.32804: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28285 1727204259.32935: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28285 1727204259.34835: stdout chunk (state=3): >>>ansible-tmp-1727204259.3099487-28387-144073599747838=/root/.ansible/tmp/ansible-tmp-1727204259.3099487-28387-144073599747838 <<< 28285 1727204259.34953: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28285 1727204259.35034: stderr chunk (state=3): >>><<< 28285 1727204259.35037: stdout chunk (state=3): >>><<< 28285 1727204259.35371: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204259.3099487-28387-144073599747838=/root/.ansible/tmp/ansible-tmp-1727204259.3099487-28387-144073599747838 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28285 1727204259.35375: variable 'ansible_module_compression' from source: unknown 28285 1727204259.35377: ANSIBALLZ: Using generic lock for ansible.legacy.setup 28285 1727204259.35380: ANSIBALLZ: Acquiring lock 28285 1727204259.35382: ANSIBALLZ: Lock acquired: 
140647066829648 28285 1727204259.35384: ANSIBALLZ: Creating module 28285 1727204259.99443: ANSIBALLZ: Writing module into payload 28285 1727204259.99626: ANSIBALLZ: Writing module 28285 1727204259.99659: ANSIBALLZ: Renaming module 28285 1727204259.99662: ANSIBALLZ: Done creating module 28285 1727204259.99712: variable 'ansible_facts' from source: unknown 28285 1727204259.99715: variable 'inventory_hostname' from source: host vars for 'managed-node1' 28285 1727204259.99726: _low_level_execute_command(): starting 28285 1727204259.99732: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 28285 1727204260.00413: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28285 1727204260.00423: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28285 1727204260.00435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28285 1727204260.00450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28285 1727204260.00496: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 28285 1727204260.00504: stderr chunk (state=3): >>>debug2: match not found <<< 28285 1727204260.00514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28285 1727204260.00527: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28285 1727204260.00534: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 28285 1727204260.00541: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28285 1727204260.00549: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28285 1727204260.00565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28285 1727204260.00583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28285 1727204260.00591: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 28285 1727204260.00600: stderr chunk (state=3): >>>debug2: match found <<< 28285 1727204260.00607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28285 1727204260.00686: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28285 1727204260.00703: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28285 1727204260.00715: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28285 1727204260.00812: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28285 1727204260.02490: stdout chunk (state=3): >>>PLATFORM <<< 28285 1727204260.02601: stdout chunk (state=3): >>>Linux <<< 28285 1727204260.02605: stdout chunk (state=3): >>>FOUND /usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 28285 1727204260.02745: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28285 1727204260.02840: stderr chunk (state=3): >>><<< 28285 1727204260.02853: stdout chunk 
(state=3): >>><<< 28285 1727204260.03014: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28285 1727204260.03020 [managed-node1]: found interpreters: ['/usr/bin/python3.9', '/usr/bin/python3', '/usr/bin/python3'] 28285 1727204260.03022: _low_level_execute_command(): starting 28285 1727204260.03025: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 && sleep 0' 28285 1727204260.03090: Sending initial data 28285 1727204260.03093: Sent initial data (1181 bytes) 28285 1727204260.03996: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28285 1727204260.04013: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28285 1727204260.04039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28285 1727204260.04058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28285 1727204260.04170: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 28285 1727204260.04199: stderr chunk (state=3): >>>debug2: match not found <<< 28285 1727204260.04216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28285 1727204260.04237: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28285 1727204260.04249: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 28285 1727204260.04261: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28285 1727204260.04291: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28285 1727204260.04305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28285 1727204260.04321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28285 1727204260.04332: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 28285 1727204260.04343: stderr chunk (state=3): >>>debug2: match found <<< 28285 1727204260.04357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28285 1727204260.04442: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 28285 1727204260.04467: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28285 1727204260.04482: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28285 1727204260.04569: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28285 1727204260.08382: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 28285 1727204260.08975: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28285 1727204260.08979: stdout chunk (state=3): >>><<< 28285 1727204260.08981: stderr chunk (state=3): >>><<< 28285 1727204260.08984: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28285 1727204260.08986: variable 'ansible_facts' from source: unknown 28285 1727204260.08988: variable 'ansible_facts' from source: unknown 28285 1727204260.08990: variable 'ansible_module_compression' from source: unknown 28285 1727204260.09083: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28285ojofnqq2/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 28285 1727204260.09087: variable 'ansible_facts' from source: unknown 28285 1727204260.09233: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204259.3099487-28387-144073599747838/AnsiballZ_setup.py 28285 1727204260.09880: Sending initial data 28285 1727204260.09883: Sent initial data (154 bytes) 28285 1727204260.11824: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28285 1727204260.11828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28285 1727204260.11860: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28285 1727204260.11863: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28285 1727204260.11867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28285 1727204260.11935: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28285 1727204260.11938: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28285 1727204260.12714: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28285 1727204260.12776: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28285 1727204260.14537: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28285 1727204260.14585: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28285 1727204260.14645: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28285ojofnqq2/tmpkq2d3lny /root/.ansible/tmp/ansible-tmp-1727204259.3099487-28387-144073599747838/AnsiballZ_setup.py <<< 28285 1727204260.14689: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28285 1727204260.17819: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28285 1727204260.17823: stderr chunk (state=3): >>><<< 28285 1727204260.17830: stdout chunk (state=3): >>><<< 28285 1727204260.17855: done transferring module to remote 28285 1727204260.17872: _low_level_execute_command(): starting 28285 1727204260.17876: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204259.3099487-28387-144073599747838/ /root/.ansible/tmp/ansible-tmp-1727204259.3099487-28387-144073599747838/AnsiballZ_setup.py && sleep 0' 28285 1727204260.19646: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28285 1727204260.19653: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28285 1727204260.19665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 
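
The interpreter probe a little earlier in this log (the 'echo PLATFORM; uname; ... command -v python3.x' command) settled on /usr/bin/python3.9 on managed-node1, and that is the interpreter used to run AnsiballZ_setup.py below. As an aside, a minimal sketch of how that discovery step could be skipped by pinning the interpreter in inventory or group_vars; ansible_python_interpreter is a standard Ansible variable, but applying it here is an assumption, not something this run does:

    all:
      vars:
        ansible_python_interpreter: /usr/bin/python3.9   # path reported by the interpreter probe on managed-node1
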
28285 1727204260.19680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28285 1727204260.19719: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 28285 1727204260.19726: stderr chunk (state=3): >>>debug2: match not found <<< 28285 1727204260.19736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28285 1727204260.19752: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28285 1727204260.19757: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 28285 1727204260.19765: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28285 1727204260.19774: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28285 1727204260.19783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28285 1727204260.19794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28285 1727204260.19802: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 28285 1727204260.19808: stderr chunk (state=3): >>>debug2: match found <<< 28285 1727204260.19817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28285 1727204260.20096: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28285 1727204260.20116: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28285 1727204260.20124: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28285 1727204260.20445: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28285 1727204260.22287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28285 1727204260.22291: stdout chunk (state=3): >>><<< 28285 1727204260.22294: stderr chunk (state=3): >>><<< 28285 1727204260.22321: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28285 1727204260.22325: _low_level_execute_command(): starting 28285 1727204260.22327: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1727204259.3099487-28387-144073599747838/AnsiballZ_setup.py && sleep 0' 28285 1727204260.24156: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28285 1727204260.24179: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28285 1727204260.24195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28285 1727204260.24215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28285 1727204260.24261: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 28285 1727204260.24277: stderr chunk (state=3): >>>debug2: match not found <<< 28285 1727204260.24293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28285 1727204260.24311: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28285 1727204260.24323: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 28285 1727204260.24334: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28285 1727204260.24345: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28285 1727204260.24359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28285 1727204260.24377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28285 1727204260.24388: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 28285 1727204260.24398: stderr chunk (state=3): >>>debug2: match found <<< 28285 1727204260.24411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28285 1727204260.24593: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28285 1727204260.24610: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28285 1727204260.24625: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28285 1727204260.24861: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28285 1727204260.26802: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 28285 1727204260.26870: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 28285 1727204260.26900: stdout chunk (state=3): >>>import 'posix' # <<< 28285 1727204260.26929: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 28285 1727204260.26973: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 28285 1727204260.27031: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 28285 1727204260.27046: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 28285 1727204260.27076: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # <<< 28285 1727204260.27102: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f35c4491dc0> <<< 28285 1727204260.27146: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 28285 1727204260.27165: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c44363a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4491b20> <<< 28285 1727204260.27201: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4491ac0> <<< 28285 1727204260.27236: stdout chunk (state=3): >>>import '_signal' # <<< 28285 1727204260.27263: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 28285 1727204260.27297: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4436490> <<< 28285 1727204260.27341: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # <<< 28285 1727204260.27345: stdout chunk (state=3): >>>import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4436940> <<< 28285 1727204260.27358: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4436670> <<< 28285 1727204260.27388: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 28285 1727204260.27431: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 28285 1727204260.27471: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 28285 1727204260.27474: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 28285 1727204260.27508: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c41cf190> <<< 28285 1727204260.27511: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 28285 1727204260.27537: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 28285 1727204260.27602: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c41cf220> <<< 28285 1727204260.27632: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from 
'/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 28285 1727204260.27680: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c41f2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c41cf940> <<< 28285 1727204260.27727: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c444e880> <<< 28285 1727204260.27741: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c41c8d90> <<< 28285 1727204260.27806: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 28285 1727204260.27809: stdout chunk (state=3): >>>import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c41f2d90> <<< 28285 1727204260.27879: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4436970> <<< 28285 1727204260.27893: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 28285 1727204260.28220: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 28285 1727204260.28249: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 28285 1727204260.28298: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 28285 1727204260.28437: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c416df10> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c41740a0> <<< 28285 1727204260.28471: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 28285 1727204260.28516: stdout chunk (state=3): >>>import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 28285 1727204260.28540: stdout chunk (state=3): >>>import 
'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c41675b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c416e6a0> <<< 28285 1727204260.28585: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c416d3d0> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 28285 1727204260.28648: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 28285 1727204260.28676: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 28285 1727204260.28704: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 28285 1727204260.28736: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 28285 1727204260.28779: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' <<< 28285 1727204260.28795: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c4055e50> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4055940> import 'itertools' # <<< 28285 1727204260.28814: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4055f40> <<< 28285 1727204260.28846: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 28285 1727204260.28879: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 28285 1727204260.28916: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4055d90> <<< 28285 1727204260.28920: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4066100> import '_collections' # <<< 28285 1727204260.28979: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4149dc0> <<< 28285 1727204260.28983: stdout chunk (state=3): >>>import '_functools' # <<< 28285 1727204260.28997: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c41426a0> <<< 28285 1727204260.29074: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4155700> <<< 28285 1727204260.29079: stdout chunk (state=3): >>>import 're' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f35c4175eb0> <<< 28285 1727204260.29108: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 28285 1727204260.29125: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c4066d00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c41492e0> <<< 28285 1727204260.29162: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c4155310> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c417ba60> <<< 28285 1727204260.29208: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py <<< 28285 1727204260.29212: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 28285 1727204260.29258: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 28285 1727204260.29291: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 28285 1727204260.29295: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4066ee0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4066e20> <<< 28285 1727204260.29320: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4066d90> <<< 28285 1727204260.29355: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 28285 1727204260.29373: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py <<< 28285 1727204260.29387: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 28285 1727204260.29432: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 28285 1727204260.29477: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches 
/usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4039400> <<< 28285 1727204260.29499: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 28285 1727204260.29533: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c40394f0> <<< 28285 1727204260.29651: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c406ef70> <<< 28285 1727204260.29701: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4068ac0> <<< 28285 1727204260.29720: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4068490> <<< 28285 1727204260.29733: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 28285 1727204260.29809: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 28285 1727204260.29813: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 28285 1727204260.29826: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3f6d250> <<< 28285 1727204260.29856: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4024550> <<< 28285 1727204260.29922: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4068f40> <<< 28285 1727204260.29946: stdout chunk (state=3): >>>import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c417b0d0> <<< 28285 1727204260.29950: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 28285 1727204260.29990: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py <<< 28285 1727204260.30007: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3f7fb80> import 'errno' # <<< 28285 1727204260.30048: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c3f7feb0> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py <<< 28285 1727204260.30085: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' <<< 28285 1727204260.30105: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3f907c0> <<< 28285 1727204260.30133: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 28285 1727204260.30156: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 28285 1727204260.30183: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3f90d00> <<< 28285 1727204260.30217: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c3f1e430> <<< 28285 1727204260.30230: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3f7ffa0> <<< 28285 1727204260.30261: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 28285 1727204260.30310: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c3f2f310> <<< 28285 1727204260.30323: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3f90640> import 'pwd' # <<< 28285 1727204260.30352: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c3f2f3d0> <<< 28285 1727204260.30388: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4066a60> <<< 28285 1727204260.30417: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 28285 1727204260.30441: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 28285 1727204260.30461: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 28285 1727204260.30508: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c3f4b730> <<< 28285 1727204260.30520: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches 
/usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 28285 1727204260.30554: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c3f4ba00> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3f4b7f0> <<< 28285 1727204260.30583: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c3f4b8e0> <<< 28285 1727204260.30611: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 28285 1727204260.30805: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c3f4bd30> <<< 28285 1727204260.30846: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c3f55280> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3f4b970> <<< 28285 1727204260.30869: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3f3eac0> <<< 28285 1727204260.30892: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4066640> <<< 28285 1727204260.30914: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 28285 1727204260.30975: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 28285 1727204260.31006: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3f4bb20> <<< 28285 1727204260.31171: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f35c3e6a700> <<< 28285 1727204260.31415: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 28285 1727204260.31513: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.31547: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py <<< 28285 1727204260.31580: stdout chunk (state=3): >>># zipimport: zlib available 
# zipimport: zlib available <<< 28285 1727204260.31596: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available <<< 28285 1727204260.32816: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.33784: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3da9850> <<< 28285 1727204260.33891: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 28285 1727204260.33905: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c3da9160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3da9280> <<< 28285 1727204260.33941: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3da9fa0> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py <<< 28285 1727204260.33945: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 28285 1727204260.33997: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3da94f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3da9dc0> import 'atexit' # <<< 28285 1727204260.34039: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c3da9580> <<< 28285 1727204260.34043: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 28285 1727204260.34075: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 28285 1727204260.34125: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3da9100> <<< 28285 1727204260.34138: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 28285 1727204260.34170: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 28285 1727204260.34196: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 28285 1727204260.34218: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 28285 1727204260.34286: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3d69f70> <<< 28285 1727204260.34345: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c3748370> <<< 28285 1727204260.34381: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c3748070> <<< 28285 1727204260.34404: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 28285 1727204260.34408: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 28285 1727204260.34444: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3748cd0> <<< 28285 1727204260.34447: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3d91dc0> <<< 28285 1727204260.34608: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3d913a0> <<< 28285 1727204260.34622: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 28285 1727204260.34655: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3d91f40> <<< 28285 1727204260.34702: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py <<< 28285 1727204260.34714: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 28285 1727204260.34736: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 28285 1727204260.34776: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 28285 1727204260.34789: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f35c3ddef40> <<< 28285 1727204260.34862: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3dacd60> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3dac430> <<< 28285 1727204260.34867: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3d5caf0> <<< 28285 1727204260.34925: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c3dac550> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3dac580> <<< 28285 1727204260.34960: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 28285 1727204260.34963: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 28285 1727204260.34977: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 28285 1727204260.35012: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 28285 1727204260.35075: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' <<< 28285 1727204260.35105: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c37b6fa0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3df0280> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 28285 1727204260.35122: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 28285 1727204260.35173: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c37b3820> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3df0400> <<< 28285 1727204260.35202: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 28285 1727204260.35262: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 28285 1727204260.35286: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' <<< 28285 1727204260.35289: stdout chunk 
(state=3): >>>import '_string' # <<< 28285 1727204260.35325: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3df0c40> <<< 28285 1727204260.35459: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c37b37c0> <<< 28285 1727204260.35543: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c3d891c0> <<< 28285 1727204260.35582: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c3df09d0> <<< 28285 1727204260.35625: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c3df0550> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3de9940> <<< 28285 1727204260.35678: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 28285 1727204260.35693: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 28285 1727204260.35738: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c37a8910> <<< 28285 1727204260.35927: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c37c5dc0> <<< 28285 1727204260.35950: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c37b2550> <<< 28285 1727204260.35972: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c37a8eb0> import 'systemd.daemon' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f35c37b2970> # zipimport: zlib available <<< 28285 1727204260.36003: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py <<< 28285 1727204260.36021: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.36085: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.36181: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.36216: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py <<< 28285 1727204260.36231: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py <<< 28285 1727204260.36234: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.36320: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.36420: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.36863: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.37339: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # <<< 28285 1727204260.37344: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 28285 1727204260.37378: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 28285 1727204260.37430: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c37ee7f0> <<< 28285 1727204260.37516: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 28285 1727204260.37519: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c37f38b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c334f940> <<< 28285 1727204260.37575: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available <<< 28285 1727204260.37600: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.37617: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py <<< 28285 1727204260.37623: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.37734: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.37869: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 28285 1727204260.37897: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3d67730> # zipimport: zlib available <<< 28285 1727204260.38292: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.38662: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.38720: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.38789: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py <<< 28285 1727204260.38796: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.38823: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.38858: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available <<< 28285 1727204260.38924: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.38998: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py <<< 28285 1727204260.39021: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 28285 1727204260.39037: stdout chunk (state=3): >>>import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available <<< 28285 1727204260.39073: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.39109: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 28285 1727204260.39114: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.39294: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.39771: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 28285 1727204260.39775: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 28285 1727204260.39781: stdout chunk (state=3): >>>import '_ast' # <<< 28285 1727204260.39783: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3dab2e0> <<< 28285 1727204260.39786: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 28285 1727204260.39788: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.39790: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py <<< 28285 1727204260.39792: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 28285 1727204260.39794: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.39796: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.39831: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py <<< 28285 1727204260.39836: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.39878: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.39912: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.40003: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.40065: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 28285 1727204260.40092: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 28285 1727204260.40158: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c37e5880> <<< 28285 1727204260.40239: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c31e3ac0> <<< 28285 1727204260.40273: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py <<< 28285 1727204260.40280: stdout chunk (state=3): >>>import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available <<< 28285 1727204260.40338: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.40387: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.40414: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.40471: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc 
matches /usr/lib/python3.9/site-packages/distro.py <<< 28285 1727204260.40485: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 28285 1727204260.40534: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 28285 1727204260.40541: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 28285 1727204260.40573: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 28285 1727204260.40637: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c37f6910> <<< 28285 1727204260.40685: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3d7b970> <<< 28285 1727204260.40740: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3d65850> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py <<< 28285 1727204260.40746: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.40776: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.40799: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 28285 1727204260.40884: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py <<< 28285 1727204260.40898: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 28285 1727204260.40911: stdout chunk (state=3): >>>import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py <<< 28285 1727204260.40917: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.40974: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.41025: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.41046: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.41059: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.41100: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.41166: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.41176: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.41193: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py <<< 28285 1727204260.41209: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.41277: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.41342: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.41353: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.41393: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available <<< 28285 1727204260.41544: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.41681: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.41717: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.41766: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 28285 1727204260.41788: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py <<< 28285 1727204260.41813: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py <<< 28285 1727204260.41829: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 28285 1727204260.41860: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c30d0c70> <<< 28285 1727204260.41883: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 28285 1727204260.41910: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 28285 1727204260.41931: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 28285 1727204260.41967: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' <<< 28285 1727204260.41981: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3330a30> <<< 28285 1727204260.42016: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c33309a0> <<< 28285 1727204260.42088: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c337cb20> <<< 28285 1727204260.42093: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c337c550> <<< 28285 1727204260.42124: stdout chunk (state=3): >>>import 'multiprocessing.context' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f35c33642e0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3364970> <<< 28285 1727204260.42152: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 28285 1727204260.42166: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' <<< 28285 1727204260.42191: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 28285 1727204260.42234: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c33152b0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3315a00> <<< 28285 1727204260.42268: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 28285 1727204260.42299: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3315940> <<< 28285 1727204260.42313: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 28285 1727204260.42336: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 28285 1727204260.42377: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' <<< 28285 1727204260.42383: stdout chunk (state=3): >>>import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c31310d0> <<< 28285 1727204260.42408: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c37e13a0> <<< 28285 1727204260.42435: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3364670> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py <<< 28285 1727204260.42441: stdout chunk (state=3): >>>import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py <<< 28285 1727204260.42465: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.42493: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py <<< 
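The "loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_.../ansible_ansible.legacy.setup_payload.zip/..." lines above show the AnsiballZ payload at work: the setup module and its ansible.module_utils dependencies are shipped to the target as one zip file and imported through Python's zipimport hook rather than from files on disk, which is also why the trace reports "# zipimport: found 103 names in ..." followed by "# zipimport: zlib available" for each archived module. The sketch below is not Ansible's wrapper code; it is a minimal, hypothetical illustration of that same zipimport mechanism, and the names demo_payload.zip and demo_pkg are made up for the example.

    # Minimal, hypothetical sketch of the zipimport mechanism seen in the log.
    # NOT Ansible's AnsiballZ wrapper; demo_payload.zip / demo_pkg are invented names.
    import os
    import sys
    import tempfile
    import zipfile

    # Build a throwaway zip containing a tiny package, standing in for the payload zip.
    tmpdir = tempfile.mkdtemp()
    payload = os.path.join(tmpdir, "demo_payload.zip")
    with zipfile.ZipFile(payload, "w") as zf:
        zf.writestr("demo_pkg/__init__.py", "")
        zf.writestr("demo_pkg/util.py", "def greet():\n    return 'imported from zip'\n")

    # Putting the zip on sys.path hands the import to Python's zipimport hook,
    # analogous to the "import ansible.module_utils... # loaded from Zip ..." lines above.
    sys.path.insert(0, payload)
    from demo_pkg import util

    print(util.greet())    # -> imported from zip
    print(util.__file__)   # path points inside demo_payload.zip

Running the payload under PYTHONVERBOSE=1, as the /bin/sh -c command earlier in this trace does, is what produces the per-module "# code object from ..." and "import ... #" lines interleaved with the zipimport output.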
28285 1727204260.42499: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.42568: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.42622: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available <<< 28285 1727204260.42656: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.42728: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py <<< 28285 1727204260.42735: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.42759: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.42791: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available <<< 28285 1727204260.42852: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.42890: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py <<< 28285 1727204260.42896: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.42928: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.42981: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available <<< 28285 1727204260.43035: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.43090: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.43143: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.43193: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available <<< 28285 1727204260.43581: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.43938: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py <<< 28285 1727204260.43945: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.43998: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.44045: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 28285 1727204260.44095: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.44115: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available <<< 28285 1727204260.44143: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.44178: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py <<< 28285 1727204260.44191: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.44229: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.44286: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available <<< 28285 1727204260.44323: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.44369: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py <<< 28285 1727204260.44372: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.44401: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.44419: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available <<< 28285 1727204260.44480: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.44565: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 28285 1727204260.44590: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3020eb0> <<< 28285 1727204260.44623: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 28285 1727204260.44626: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 28285 1727204260.44783: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c30209d0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available <<< 28285 1727204260.44842: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.44899: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available <<< 28285 1727204260.44984: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.45069: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available <<< 28285 1727204260.45121: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.45188: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available <<< 28285 1727204260.45230: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.45285: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 28285 1727204260.45298: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 28285 1727204260.45443: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c308cbb0> <<< 28285 1727204260.45683: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3049a60> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available <<< 28285 1727204260.45731: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.45794: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py <<< 28285 1727204260.45797: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.45857: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.45922: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.46024: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.46154: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available <<< 28285 1727204260.46197: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.46242: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py <<< 28285 1727204260.46245: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 28285 1727204260.46275: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.46325: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py <<< 28285 1727204260.46328: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 28285 1727204260.46383: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' <<< 28285 1727204260.46388: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c3093040> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c30936d0> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py <<< 28285 1727204260.46408: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py <<< 28285 1727204260.46424: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.46467: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204260.46471: stdout chunk (state=3): >>> <<< 28285 1727204260.46512: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py <<< 28285 1727204260.46515: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.46635: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.46777: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py <<< 28285 1727204260.46781: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.46861: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.46942: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.46981: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.47032: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py <<< 28285 1727204260.47035: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.47114: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.47129: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.47239: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.47362: stdout chunk (state=3): >>>import 
ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available <<< 28285 1727204260.47472: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.47575: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available <<< 28285 1727204260.47610: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.47637: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.48073: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.48501: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available <<< 28285 1727204260.48603: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.48675: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available <<< 28285 1727204260.48756: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.48850: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available <<< 28285 1727204260.48973: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.49126: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available <<< 28285 1727204260.49148: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py <<< 28285 1727204260.49151: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.49184: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.49218: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py <<< 28285 1727204260.49232: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.49309: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 
1727204260.49396: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.49561: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.49741: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 28285 1727204260.49745: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.49775: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.49814: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available <<< 28285 1727204260.49837: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.49874: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py <<< 28285 1727204260.49877: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.49929: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.49995: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available <<< 28285 1727204260.50016: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.50050: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py <<< 28285 1727204260.50053: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.50097: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.50151: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py <<< 28285 1727204260.50154: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.50201: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.50259: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py <<< 28285 1727204260.50264: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.50473: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.50685: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available <<< 28285 1727204260.50745: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 28285 1727204260.50802: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available <<< 28285 1727204260.50827: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.50872: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py <<< 28285 1727204260.50875: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.50923: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.50945: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available <<< 28285 1727204260.50969: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.51002: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available <<< 28285 1727204260.51132: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.51175: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available <<< 28285 1727204260.51194: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available <<< 28285 1727204260.51224: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.51279: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py <<< 28285 1727204260.51306: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.51320: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 28285 1727204260.51354: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.51397: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.51455: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.51535: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available <<< 28285 1727204260.51580: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.51644: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py <<< 28285 1727204260.51650: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.51791: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.51971: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py <<< 28285 1727204260.51990: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.52003: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.52054: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py <<< 28285 1727204260.52057: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.52098: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.52140: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py <<< 28285 1727204260.52143: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.52214: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.52292: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available <<< 28285 1727204260.52370: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.52451: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 28285 1727204260.52536: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204260.53278: stdout chunk (state=3): >>>import 'gc' # <<< 28285 1727204260.53770: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py <<< 28285 1727204260.53775: stdout chunk (state=3): >>># code object 
from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 28285 1727204260.53796: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py <<< 28285 1727204260.53799: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 28285 1727204260.53856: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c3015190> <<< 28285 1727204260.53859: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3015b20> <<< 28285 1727204260.53978: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c2fc1d30> <<< 28285 1727204261.59531: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py <<< 28285 1727204261.59586: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' <<< 28285 1727204261.59614: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3015550><<< 28285 1727204261.59629: stdout chunk (state=3): >>> <<< 28285 1727204261.59674: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py<<< 28285 1727204261.59682: stdout chunk (state=3): >>> <<< 28285 1727204261.59705: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc'<<< 28285 1727204261.59709: stdout chunk (state=3): >>> <<< 28285 1727204261.59755: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c2fd75b0> <<< 28285 1727204261.59831: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py<<< 28285 1727204261.59839: stdout chunk (state=3): >>> <<< 28285 1727204261.59867: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc'<<< 28285 1727204261.59873: stdout chunk (state=3): >>> <<< 28285 1727204261.59917: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py <<< 28285 1727204261.59931: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc'<<< 28285 1727204261.59936: stdout chunk (state=3): >>> <<< 28285 1727204261.59967: stdout chunk (state=3): >>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c2e2ba60><<< 28285 1727204261.59973: stdout chunk (state=3): >>> <<< 28285 1727204261.59999: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c2e2b640><<< 28285 1727204261.60005: stdout chunk (state=3): >>> <<< 
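Editor's note: at this point every per-platform fact collector has been imported - system (distribution, selinux, service_mgr, ssh_pub_keys, ...), hardware (aix through sunos), network (aix through sunos) and virtual - because the module was invoked with gather_subset "all", as the invocation block in the result below confirms. When only a few fact groups are needed, the same module accepts a narrower gather_subset and an optional filter; a sketch under that assumption (subset and filter values are illustrative, not taken from this run):

    # Illustrative sketch - not part of the logged run above.
    - name: Gather only network and virtualization facts
      ansible.builtin.setup:
        gather_subset:
          - network
          - virtual
        filter:
          - ansible_default_ipv4
          - ansible_interfaces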
28285 1727204261.60451: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame <<< 28285 1727204261.60455: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame <<< 28285 1727204261.60457: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame <<< 28285 1727204261.60460: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame <<< 28285 1727204261.60462: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame <<< 28285 1727204261.80574: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fibre_channel_wwn": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_pkg_mgr": "dnf", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "40", "epoch": "1727204260", "epoch_int": "1727204260", "date": "2024-09-24", "time": "14:57:40", "iso8601_micro": "2024-09-24T18:57:40.537158Z", "iso8601": "2024-09-24T18:57:40Z", "iso8601_basic": "20240924T145740537158", "iso8601_basic_short": "20240924T145740", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAL33r0sK53nK1ELEWEygFfBly+jKL3G1irB+e4OjfP+034giVSb4+qmZbsccYzFRUysDiwQ9AOkXxjXzuDmR+xqyvjg1wiGR1mtnhVEjD5QOMP3FrsA4T0YUj+99RePF5V1syZcivhL83fhGMQW2xqX2DsatToaaogZ2OB3PfjtjAAAAFQDxVD/D0jmbOX2y1lkpNUepQHopQwAAAIEAmIlCnXBlcPjUWk7TsM1FtBhlH1jHHCOTF1EkipNNB3yizhCo4XzHdZ42Etc3A12/rcZ94rFaauqCV6XrShBkQ2YBIcz9u8BOyWI/nScoq9IA/qLrhWWtjBWDyrdKnEa5YZssQtDa+FaZQkzy1TQpvFQxv5c95+TrmPFgDpw+0q0AAACBAKYOTFtEPTGzq9w6YdMspRWk65ZgXou58bQl818PvNnuZKKVReDFknfNCcabfj+HjlOg9wBCZZ+D3vopxZ4Qgevz/pLqcnLY7Kxx+xf6NhqDwcEwkHk/VYomBLrfyEZP8N81dcv36ZZUVoca5Y+2ZG2o1gC632nLGosyJBtmPmel", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCzhflzK5hY1zLI6wcdpu95QnCG0TwdK/8SyDudYYHDoRycyI9pVuSUQsXpQq3jHdjCurFgtKtyr8lvo1dWvKJ9SZpQk4asejncDNODyhSR95eNQg6E1G2kN1mscOp76cjW9Muvyhcku112WdRWTEojLJL5DfJAWrsWwHJI+QuhZuKvrlsxPvfOuY5td/aGC/Ydzbjkmya2qvXXJRscQArDnYmiPiatkFESRif9MXdmIn2LqQXAcZGFUG+SWQvZR1PDWKI2U5HxvoeUf+Uh2jDO3mFDWao9+SGRC2QuO+xLJgoiKIx2L3GWLTkbKjAbsk0iedaUuh+GdmUatsU09UVZi9IYBJYjhiYuZKsYx2LNpBqu8fxh5RaBfadQzkDGVBJE45/9X+9vlSygk3zMak9yWtS9vfV+CoODJx9wA1tv3r0Veiy/Y9bbcT7DtQhiWscP2X/cF2QZtdabW+Rb+zKZomn+6upN+zZeyVRClRsqVNURxevMs+UyJTKV481ayMU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHpEZiCiKJZKK5PvXzPGl0kyJcU4P7nxoUjBffLcHt9dAB0RhjGORZ4v3/W6TdO0PAsLaKZ7WyFecLN3V9VWyiA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIK5vZWfq5/76ny3vCPOJqG/mpsIiiNwZzQWhA7bM1PFT", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "an<<< 28285 1727204261.80617: stdout chunk (state=3): >>>sible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "622812727ab94fd6acd7dd0d437b6e90", "ansible_fips": false, "ansible_is_chroot": false, "ansible_loadavg": {"1m": 0.55, "5m": 0.43, "15m": 0.23}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_iscsi_iqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::108f:92ff:fee7:c1ab", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", 
"tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", 
"rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.148"], "ansible_all_ipv6_addresses": ["fe80::108f:92ff:fee7:c1ab"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.148", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::108f:92ff:fee7:c1ab"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 42862 10.31.9.148 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 42862 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2791, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 741, "free": 2791}, "nocache": {"free": 3265, "used": 267}, "swap": {"total": 0, "free": 0, "use<<< 28285 1727204261.80646: stdout chunk (state=3): >>>d": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_uuid": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", 
"holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 523, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264271618048, "block_size": 4096, "block_total": 65519355, "block_available": 64519438, "block_used": 999917, "inode_total": 131071472, "inode_available": 130998225, "inode_used": 73247, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 28285 1727204261.81194: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr <<< 28285 1727204261.81479: stdout chunk (state=3): >>># cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # 
cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy 
ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle <<< 28285 1727204261.81520: stdout chunk (state=3): >>># cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing 
ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] 
removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy 
ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 28285 1727204261.81775: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 28285 1727204261.81809: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 28285 1727204261.81876: stdout chunk (state=3): >>># destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma <<< 28285 1727204261.81891: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings <<< 28285 1727204261.81912: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 28285 1727204261.81959: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy logging # destroy argparse <<< 28285 1727204261.82036: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle <<< 28285 1727204261.82061: stdout chunk (state=3): >>># destroy queue # destroy multiprocessing.reduction <<< 28285 1727204261.82103: stdout chunk (state=3): >>># destroy shlex # destroy datetime # destroy base64 <<< 28285 1727204261.82143: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json <<< 28285 1727204261.82190: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection <<< 28285 1727204261.82319: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle <<< 28285 1727204261.82415: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] 
wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools <<< 28285 1727204261.82478: stdout chunk (state=3): >>># cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os <<< 28285 1727204261.82547: stdout chunk (state=3): >>># cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 28285 1727204261.82567: stdout chunk (state=3): >>># destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 28285 1727204261.82749: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize <<< 28285 1727204261.82816: stdout chunk (state=3): >>># destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors <<< 28285 1727204261.82840: stdout chunk (state=3): >>># destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy 
operator # destroy ansible.module_utils.six.moves # destroy _operator <<< 28285 1727204261.82843: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 28285 1727204261.82873: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 28285 1727204261.83281: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 28285 1727204261.83284: stdout chunk (state=3): >>><<< 28285 1727204261.83287: stderr chunk (state=3): >>><<< 28285 1727204261.83421: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4491dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c44363a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4491b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4491ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4436490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4436940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4436670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c41cf190> # 
/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c41cf220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c41f2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c41cf940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c444e880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c41c8d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c41f2d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4436970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
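The Python banner above closes the interpreter start-up portion of the echoed stdout: every import 'x' # ... line is CPython's verbose import trace, and the # cleanup[2] removing / # destroy / # cleanup[3] wiping lines earlier in this task's output are the matching messages CPython prints while finalizing. A comparable trace can be reproduced locally with CPython's -v flag; the snippet below is only a local illustration under that assumption, not the wrapper command Ansible actually runs on the managed node.

import subprocess

# Run a throwaway interpreter with verbose import logging (-v) and capture its
# output; CPython writes the import/cleanup trace to stderr.
result = subprocess.run(
    ["python3", "-v", "-c", "import json"],
    capture_output=True,
    text=True,
)
print(result.stderr[:2000])  # import '...' lines, then cleanup/destroy lines at shutdown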
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c416df10> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c41740a0> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c41675b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c416e6a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c416d3d0> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c4055e50> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4055940> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4055f40> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4055d90> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4066100> import '_collections' # import 
'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4149dc0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c41426a0> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4155700> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4175eb0> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c4066d00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c41492e0> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c4155310> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c417ba60> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4066ee0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4066e20> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4066d90> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4039400> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches 
/usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c40394f0> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c406ef70> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4068ac0> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4068490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3f6d250> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4024550> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4068f40> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c417b0d0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3f7fb80> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c3f7feb0> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3f907c0> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3f90d00> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c3f1e430> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3f7ffa0> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # 
extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c3f2f310> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3f90640> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c3f2f3d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4066a60> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c3f4b730> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c3f4ba00> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3f4b7f0> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c3f4b8e0> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c3f4bd30> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c3f55280> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3f4b970> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3f3eac0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c4066640> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 
'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3f4bb20> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f35c3e6a700> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3da9850> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c3da9160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3da9280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3da9fa0> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3da94f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3da9dc0> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c3da9580> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3da9100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3d69f70> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c3748370> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c3748070> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3748cd0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3d91dc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3d913a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3d91f40> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3ddef40> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3dacd60> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3dac430> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3d5caf0> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c3dac550> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3dac580> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from 
'/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c37b6fa0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3df0280> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c37b3820> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3df0400> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3df0c40> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c37b37c0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c3d891c0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c3df09d0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c3df0550> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3de9940> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # 
extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c37a8910> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c37c5dc0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c37b2550> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c37a8eb0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c37b2970> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c37ee7f0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c37f38b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c334f940> import ansible.module_utils.compat.selinux # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3d67730> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3dab2e0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c37e5880> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c31e3ac0> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c37f6910> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3d7b970> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3d65850> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available 
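Everything tagged "loaded from Zip" in this stretch of the trace is imported straight out of the temporary payload archive reported earlier (# zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip'), so the ansible.module_utils dependencies shown here are resolved from inside the archive rather than from unpacked files. The sketch below reproduces that standard-library mechanism in miniature; the names it uses (payload.zip, hello.py, GREETING) are made up for illustration and are not Ansible's.

import os
import sys
import tempfile
import zipfile

# Build a tiny zip "payload" holding one importable module.
workdir = tempfile.mkdtemp()
payload = os.path.join(workdir, "payload.zip")
with zipfile.ZipFile(payload, "w") as zf:
    zf.writestr("hello.py", "GREETING = 'imported from a zip'\n")

# Putting the archive on sys.path lets the zipimport hook resolve imports
# from inside it, the same "# zipimport: ..." machinery the log shows.
sys.path.insert(0, payload)
import hello

print(hello.GREETING)

Bundling the module together with its module_utils in a single archive keeps the remote side self-contained, which matches what the rest of this trace shows: one temporary zip feeding every ansible.module_utils and fact-collector import.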
# zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c30d0c70> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3330a30> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c33309a0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c337cb20> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c337c550> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c33642e0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3364970> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c33152b0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3315a00> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from 
'/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3315940> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c31310d0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c37e13a0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3364670> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3020eb0> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c30209d0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib 
available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c308cbb0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3049a60> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c3093040> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c30936d0> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # 
zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_9i9e9o9o/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available import 'gc' # # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from 
'/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35c3015190> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3015b20> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c2fc1d30> # /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c3015550> # /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c2fd75b0> # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c2e2ba60> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35c2e2b640> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fibre_channel_wwn": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_pkg_mgr": "dnf", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, 
"ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "40", "epoch": "1727204260", "epoch_int": "1727204260", "date": "2024-09-24", "time": "14:57:40", "iso8601_micro": "2024-09-24T18:57:40.537158Z", "iso8601": "2024-09-24T18:57:40Z", "iso8601_basic": "20240924T145740537158", "iso8601_basic_short": "20240924T145740", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAL33r0sK53nK1ELEWEygFfBly+jKL3G1irB+e4OjfP+034giVSb4+qmZbsccYzFRUysDiwQ9AOkXxjXzuDmR+xqyvjg1wiGR1mtnhVEjD5QOMP3FrsA4T0YUj+99RePF5V1syZcivhL83fhGMQW2xqX2DsatToaaogZ2OB3PfjtjAAAAFQDxVD/D0jmbOX2y1lkpNUepQHopQwAAAIEAmIlCnXBlcPjUWk7TsM1FtBhlH1jHHCOTF1EkipNNB3yizhCo4XzHdZ42Etc3A12/rcZ94rFaauqCV6XrShBkQ2YBIcz9u8BOyWI/nScoq9IA/qLrhWWtjBWDyrdKnEa5YZssQtDa+FaZQkzy1TQpvFQxv5c95+TrmPFgDpw+0q0AAACBAKYOTFtEPTGzq9w6YdMspRWk65ZgXou58bQl818PvNnuZKKVReDFknfNCcabfj+HjlOg9wBCZZ+D3vopxZ4Qgevz/pLqcnLY7Kxx+xf6NhqDwcEwkHk/VYomBLrfyEZP8N81dcv36ZZUVoca5Y+2ZG2o1gC632nLGosyJBtmPmel", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCzhflzK5hY1zLI6wcdpu95QnCG0TwdK/8SyDudYYHDoRycyI9pVuSUQsXpQq3jHdjCurFgtKtyr8lvo1dWvKJ9SZpQk4asejncDNODyhSR95eNQg6E1G2kN1mscOp76cjW9Muvyhcku112WdRWTEojLJL5DfJAWrsWwHJI+QuhZuKvrlsxPvfOuY5td/aGC/Ydzbjkmya2qvXXJRscQArDnYmiPiatkFESRif9MXdmIn2LqQXAcZGFUG+SWQvZR1PDWKI2U5HxvoeUf+Uh2jDO3mFDWao9+SGRC2QuO+xLJgoiKIx2L3GWLTkbKjAbsk0iedaUuh+GdmUatsU09UVZi9IYBJYjhiYuZKsYx2LNpBqu8fxh5RaBfadQzkDGVBJE45/9X+9vlSygk3zMak9yWtS9vfV+CoODJx9wA1tv3r0Veiy/Y9bbcT7DtQhiWscP2X/cF2QZtdabW+Rb+zKZomn+6upN+zZeyVRClRsqVNURxevMs+UyJTKV481ayMU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHpEZiCiKJZKK5PvXzPGl0kyJcU4P7nxoUjBffLcHt9dAB0RhjGORZ4v3/W6TdO0PAsLaKZ7WyFecLN3V9VWyiA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIK5vZWfq5/76ny3vCPOJqG/mpsIiiNwZzQWhA7bM1PFT", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "622812727ab94fd6acd7dd0d437b6e90", "ansible_fips": false, "ansible_is_chroot": false, "ansible_loadavg": {"1m": 0.55, "5m": 0.43, "15m": 0.23}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_iscsi_iqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": 
"12:8f:92:e7:c1:ab", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::108f:92ff:fee7:c1ab", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": 
"off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.148"], "ansible_all_ipv6_addresses": ["fe80::108f:92ff:fee7:c1ab"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.148", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::108f:92ff:fee7:c1ab"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 42862 10.31.9.148 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 42862 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, 
"ansible_memfree_mb": 2791, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 741, "free": 2791}, "nocache": {"free": 3265, "used": 267}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_uuid": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 523, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264271618048, "block_size": 4096, "block_total": 65519355, "block_available": 64519438, "block_used": 999917, "inode_total": 131071472, "inode_available": 130998225, "inode_used": 73247, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] 
removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] 
removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing 
_pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] 
removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy 
ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # 
cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy 
ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. [WARNING]: Module invocation had junk after the JSON data: # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing 
_collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy 
ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot 
# cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing 
ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy 
ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # 
cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks [WARNING]: Platform linux on host managed-node1 is using the discovered Python interpreter at /usr/bin/python3.9, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 
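(Aside on the "[WARNING]: Module invocation had junk after the JSON data" entry above: it appears to be a side effect of this debug run. The setup module is launched under PYTHONVERBOSE=1, as seen further down where _low_level_execute_command runs AnsiballZ_setup.py, so the interpreter's "# cleanup"/"# destroy" chatter lands on stdout after the module's JSON result. The snippet below is a minimal illustrative sketch, not Ansible's actual parser; the function name split_json_and_junk and the sample payload string are hypothetical, chosen only to mirror the shape of the output logged above.)

import json

# Illustrative sketch only (not Ansible's own code): separate a leading JSON
# document from whatever trails it, e.g. the interpreter cleanup noise that
# triggers the "junk after the JSON data" warning in this log.
def split_json_and_junk(stdout: str):
    """Return (parsed_result, trailing_junk) from raw module stdout."""
    stripped = stdout.lstrip()                      # raw_decode rejects leading whitespace
    result, end = json.JSONDecoder().raw_decode(stripped)
    return result, stripped[end:].strip()           # everything past the JSON is "junk"

# Hypothetical payload shaped like the module output shown above:
raw = ('{"ansible_facts": {"discovered_interpreter_python": "/usr/bin/python3.9"}}'
       ' # clear builtins._ # clear sys.path # destroy _frozen_importlib')
facts, junk = split_json_and_junk(raw)
print(facts["ansible_facts"]["discovered_interpreter_python"])  # /usr/bin/python3.9
print("junk after the JSON data:", junk)

Running the sketch recovers the facts dictionary and reports the non-JSON remainder separately, which is the same distinction the warning above is drawing for the real module invocation.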
28285 1727204261.84908: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204259.3099487-28387-144073599747838/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28285 1727204261.84911: _low_level_execute_command(): starting 28285 1727204261.84970: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204259.3099487-28387-144073599747838/ > /dev/null 2>&1 && sleep 0' 28285 1727204261.85693: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28285 1727204261.85707: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28285 1727204261.85720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28285 1727204261.85738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28285 1727204261.85798: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 28285 1727204261.85810: stderr chunk (state=3): >>>debug2: match not found <<< 28285 1727204261.85824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28285 1727204261.85841: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28285 1727204261.85862: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 28285 1727204261.85883: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28285 1727204261.85897: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28285 1727204261.85911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28285 1727204261.85928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28285 1727204261.85941: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 28285 1727204261.85956: stderr chunk (state=3): >>>debug2: match found <<< 28285 1727204261.85978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28285 1727204261.86068: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28285 1727204261.86104: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28285 1727204261.86123: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28285 1727204261.86223: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28285 1727204261.87966: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28285 1727204261.88065: stderr chunk (state=3): >>><<< 28285 1727204261.88078: stdout chunk (state=3): >>><<< 28285 1727204261.88183: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28285 1727204261.88188: handler run complete 28285 1727204261.88308: variable 'ansible_facts' from source: unknown 28285 1727204261.88375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204261.88753: variable 'ansible_facts' from source: unknown 28285 1727204261.88842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204261.89009: attempt loop complete, returning result 28285 1727204261.89019: _execute() done 28285 1727204261.89026: dumping result to json 28285 1727204261.89064: done dumping result, returning 28285 1727204261.89089: done running TaskExecutor() for managed-node1/TASK: Gathering Facts [0affcd87-79f5-57a1-d976-0000000001aa] 28285 1727204261.89099: sending task result for task 0affcd87-79f5-57a1-d976-0000000001aa ok: [managed-node1] 28285 1727204261.89769: no more pending results, returning what we have 28285 1727204261.89773: results queue empty 28285 1727204261.89774: checking for any_errors_fatal 28285 1727204261.89785: done checking for any_errors_fatal 28285 1727204261.89786: checking for max_fail_percentage 28285 1727204261.89788: done checking for max_fail_percentage 28285 1727204261.89789: checking to see if all hosts have failed and the running result is not ok 28285 1727204261.89790: done checking to see if all hosts have failed 28285 1727204261.89791: getting the remaining hosts for this loop 28285 1727204261.89793: done getting the remaining hosts for this loop 28285 1727204261.89801: getting the next task for host managed-node1 28285 1727204261.89807: done getting next task for host managed-node1 28285 1727204261.89809: ^ task is: TASK: meta (flush_handlers) 28285 1727204261.89811: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28285 1727204261.89815: getting variables 28285 1727204261.89817: in VariableManager get_vars() 28285 1727204261.89839: Calling all_inventory to load vars for managed-node1 28285 1727204261.89842: Calling groups_inventory to load vars for managed-node1 28285 1727204261.89845: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204261.89855: done sending task result for task 0affcd87-79f5-57a1-d976-0000000001aa 28285 1727204261.89858: WORKER PROCESS EXITING 28285 1727204261.89870: Calling all_plugins_play to load vars for managed-node1 28285 1727204261.89873: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204261.89876: Calling groups_plugins_play to load vars for managed-node1 28285 1727204261.90036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204261.90152: done with get_vars() 28285 1727204261.90160: done getting variables 28285 1727204261.90208: in VariableManager get_vars() 28285 1727204261.90217: Calling all_inventory to load vars for managed-node1 28285 1727204261.90218: Calling groups_inventory to load vars for managed-node1 28285 1727204261.90220: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204261.90223: Calling all_plugins_play to load vars for managed-node1 28285 1727204261.90224: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204261.90231: Calling groups_plugins_play to load vars for managed-node1 28285 1727204261.90310: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204261.90416: done with get_vars() 28285 1727204261.90427: done queuing things up, now waiting for results queue to drain 28285 1727204261.90429: results queue empty 28285 1727204261.90430: checking for any_errors_fatal 28285 1727204261.90432: done checking for any_errors_fatal 28285 1727204261.90433: checking for max_fail_percentage 28285 1727204261.90434: done checking for max_fail_percentage 28285 1727204261.90434: checking to see if all hosts have failed and the running result is not ok 28285 1727204261.90435: done checking to see if all hosts have failed 28285 1727204261.90435: getting the remaining hosts for this loop 28285 1727204261.90436: done getting the remaining hosts for this loop 28285 1727204261.90438: getting the next task for host managed-node1 28285 1727204261.90441: done getting next task for host managed-node1 28285 1727204261.90442: ^ task is: TASK: Include the task 'el_repo_setup.yml' 28285 1727204261.90443: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28285 1727204261.90445: getting variables 28285 1727204261.90445: in VariableManager get_vars() 28285 1727204261.90451: Calling all_inventory to load vars for managed-node1 28285 1727204261.90453: Calling groups_inventory to load vars for managed-node1 28285 1727204261.90454: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204261.90457: Calling all_plugins_play to load vars for managed-node1 28285 1727204261.90458: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204261.90460: Calling groups_plugins_play to load vars for managed-node1 28285 1727204261.90552: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204261.90657: done with get_vars() 28285 1727204261.90662: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethtool_features_initscripts.yml:10 Tuesday 24 September 2024 14:57:41 -0400 (0:00:02.694) 0:00:02.717 ***** 28285 1727204261.90716: entering _queue_task() for managed-node1/include_tasks 28285 1727204261.90717: Creating lock for include_tasks 28285 1727204261.90925: worker is 1 (out of 1 available) 28285 1727204261.90937: exiting _queue_task() for managed-node1/include_tasks 28285 1727204261.90949: done queuing things up, now waiting for results queue to drain 28285 1727204261.90950: waiting for pending results... 28285 1727204261.91092: running TaskExecutor() for managed-node1/TASK: Include the task 'el_repo_setup.yml' 28285 1727204261.91151: in run() - task 0affcd87-79f5-57a1-d976-000000000006 28285 1727204261.91168: variable 'ansible_search_path' from source: unknown 28285 1727204261.91195: calling self._execute() 28285 1727204261.91253: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204261.91258: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204261.91271: variable 'omit' from source: magic vars 28285 1727204261.91347: _execute() done 28285 1727204261.91353: dumping result to json 28285 1727204261.91388: done dumping result, returning 28285 1727204261.91393: done running TaskExecutor() for managed-node1/TASK: Include the task 'el_repo_setup.yml' [0affcd87-79f5-57a1-d976-000000000006] 28285 1727204261.91396: sending task result for task 0affcd87-79f5-57a1-d976-000000000006 28285 1727204261.91566: done sending task result for task 0affcd87-79f5-57a1-d976-000000000006 28285 1727204261.91569: WORKER PROCESS EXITING 28285 1727204261.91636: no more pending results, returning what we have 28285 1727204261.91640: in VariableManager get_vars() 28285 1727204261.91668: Calling all_inventory to load vars for managed-node1 28285 1727204261.91670: Calling groups_inventory to load vars for managed-node1 28285 1727204261.91673: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204261.91681: Calling all_plugins_play to load vars for managed-node1 28285 1727204261.91684: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204261.91686: Calling groups_plugins_play to load vars for managed-node1 28285 1727204261.91857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204261.92056: done with get_vars() 28285 1727204261.92065: variable 'ansible_search_path' from source: unknown 28285 1727204261.92078: we have included files to process 28285 
1727204261.92079: generating all_blocks data 28285 1727204261.92080: done generating all_blocks data 28285 1727204261.92081: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 28285 1727204261.92083: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 28285 1727204261.92085: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 28285 1727204261.92737: in VariableManager get_vars() 28285 1727204261.92754: done with get_vars() 28285 1727204261.92768: done processing included file 28285 1727204261.92770: iterating over new_blocks loaded from include file 28285 1727204261.92772: in VariableManager get_vars() 28285 1727204261.92781: done with get_vars() 28285 1727204261.92782: filtering new block on tags 28285 1727204261.92796: done filtering new block on tags 28285 1727204261.92799: in VariableManager get_vars() 28285 1727204261.92808: done with get_vars() 28285 1727204261.92809: filtering new block on tags 28285 1727204261.92822: done filtering new block on tags 28285 1727204261.92824: in VariableManager get_vars() 28285 1727204261.92834: done with get_vars() 28285 1727204261.92836: filtering new block on tags 28285 1727204261.92850: done filtering new block on tags 28285 1727204261.92853: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed-node1 28285 1727204261.92858: extending task lists for all hosts with included blocks 28285 1727204261.92905: done extending task lists 28285 1727204261.92907: done processing included files 28285 1727204261.92908: results queue empty 28285 1727204261.92908: checking for any_errors_fatal 28285 1727204261.92910: done checking for any_errors_fatal 28285 1727204261.92910: checking for max_fail_percentage 28285 1727204261.92911: done checking for max_fail_percentage 28285 1727204261.92912: checking to see if all hosts have failed and the running result is not ok 28285 1727204261.92912: done checking to see if all hosts have failed 28285 1727204261.92913: getting the remaining hosts for this loop 28285 1727204261.92914: done getting the remaining hosts for this loop 28285 1727204261.92916: getting the next task for host managed-node1 28285 1727204261.92920: done getting next task for host managed-node1 28285 1727204261.92922: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 28285 1727204261.92924: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28285 1727204261.92926: getting variables 28285 1727204261.92927: in VariableManager get_vars() 28285 1727204261.92934: Calling all_inventory to load vars for managed-node1 28285 1727204261.92936: Calling groups_inventory to load vars for managed-node1 28285 1727204261.92938: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204261.92943: Calling all_plugins_play to load vars for managed-node1 28285 1727204261.92946: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204261.92951: Calling groups_plugins_play to load vars for managed-node1 28285 1727204261.93102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204261.93285: done with get_vars() 28285 1727204261.93293: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Tuesday 24 September 2024 14:57:41 -0400 (0:00:00.026) 0:00:02.743 ***** 28285 1727204261.93356: entering _queue_task() for managed-node1/setup 28285 1727204261.93613: worker is 1 (out of 1 available) 28285 1727204261.93625: exiting _queue_task() for managed-node1/setup 28285 1727204261.93639: done queuing things up, now waiting for results queue to drain 28285 1727204261.93640: waiting for pending results... 28285 1727204261.93883: running TaskExecutor() for managed-node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 28285 1727204261.93957: in run() - task 0affcd87-79f5-57a1-d976-0000000001bb 28285 1727204261.93969: variable 'ansible_search_path' from source: unknown 28285 1727204261.93972: variable 'ansible_search_path' from source: unknown 28285 1727204261.93999: calling self._execute() 28285 1727204261.94055: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204261.94059: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204261.94071: variable 'omit' from source: magic vars 28285 1727204261.94461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204261.96242: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204261.96316: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204261.96361: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204261.96403: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204261.96436: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204261.96555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204261.96592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204261.96659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 28285 1727204261.96725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204261.96744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204261.96941: variable 'ansible_facts' from source: unknown 28285 1727204261.97019: variable 'network_test_required_facts' from source: task vars 28285 1727204261.97072: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 28285 1727204261.97086: variable 'omit' from source: magic vars 28285 1727204261.97127: variable 'omit' from source: magic vars 28285 1727204261.97168: variable 'omit' from source: magic vars 28285 1727204261.97198: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28285 1727204261.97229: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28285 1727204261.97255: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28285 1727204261.97279: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28285 1727204261.97294: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28285 1727204261.97324: variable 'inventory_hostname' from source: host vars for 'managed-node1' 28285 1727204261.97335: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204261.97342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204261.97442: Set connection var ansible_shell_executable to /bin/sh 28285 1727204261.97459: Set connection var ansible_pipelining to False 28285 1727204261.97477: Set connection var ansible_timeout to 10 28285 1727204261.97485: Set connection var ansible_shell_type to sh 28285 1727204261.97495: Set connection var ansible_connection to ssh 28285 1727204261.97519: Set connection var ansible_module_compression to ZIP_DEFLATED 28285 1727204261.97562: variable 'ansible_shell_executable' from source: unknown 28285 1727204261.97574: variable 'ansible_connection' from source: unknown 28285 1727204261.97582: variable 'ansible_module_compression' from source: unknown 28285 1727204261.97599: variable 'ansible_shell_type' from source: unknown 28285 1727204261.97618: variable 'ansible_shell_executable' from source: unknown 28285 1727204261.97642: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204261.97659: variable 'ansible_pipelining' from source: unknown 28285 1727204261.97676: variable 'ansible_timeout' from source: unknown 28285 1727204261.97700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204261.97886: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 28285 1727204261.97891: variable 'omit' from source: magic vars 28285 1727204261.97905: starting attempt loop 28285 
1727204261.97912: running the handler 28285 1727204261.97923: _low_level_execute_command(): starting 28285 1727204261.97929: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28285 1727204261.98416: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28285 1727204261.98446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28285 1727204261.98465: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28285 1727204261.98508: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28285 1727204261.98520: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28285 1727204261.98588: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 28285 1727204262.00851: stdout chunk (state=3): >>>/root <<< 28285 1727204262.00999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28285 1727204262.01094: stderr chunk (state=3): >>><<< 28285 1727204262.01098: stdout chunk (state=3): >>><<< 28285 1727204262.01100: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 28285 1727204262.01120: _low_level_execute_command(): starting 28285 1727204262.01135: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204262.011042-28500-165936024065541 `" && echo ansible-tmp-1727204262.011042-28500-165936024065541="` echo /root/.ansible/tmp/ansible-tmp-1727204262.011042-28500-165936024065541 `" ) && sleep 0' 28285 
1727204262.01848: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28285 1727204262.01874: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28285 1727204262.01893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28285 1727204262.01913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28285 1727204262.01958: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 28285 1727204262.01978: stderr chunk (state=3): >>>debug2: match not found <<< 28285 1727204262.01996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28285 1727204262.02015: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28285 1727204262.02028: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 28285 1727204262.02040: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28285 1727204262.02052: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28285 1727204262.02071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28285 1727204262.02092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28285 1727204262.02109: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 28285 1727204262.02121: stderr chunk (state=3): >>>debug2: match found <<< 28285 1727204262.02135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28285 1727204262.02223: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28285 1727204262.02246: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28285 1727204262.02266: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28285 1727204262.02362: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 28285 1727204262.04917: stdout chunk (state=3): >>>ansible-tmp-1727204262.011042-28500-165936024065541=/root/.ansible/tmp/ansible-tmp-1727204262.011042-28500-165936024065541 <<< 28285 1727204262.05090: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28285 1727204262.05197: stderr chunk (state=3): >>><<< 28285 1727204262.05203: stdout chunk (state=3): >>><<< 28285 1727204262.05206: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204262.011042-28500-165936024065541=/root/.ansible/tmp/ansible-tmp-1727204262.011042-28500-165936024065541 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 28285 1727204262.05257: variable 'ansible_module_compression' from source: unknown 28285 1727204262.05385: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28285ojofnqq2/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 28285 1727204262.05388: variable 'ansible_facts' from source: unknown 28285 1727204262.05562: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204262.011042-28500-165936024065541/AnsiballZ_setup.py 28285 1727204262.05755: Sending initial data 28285 1727204262.05758: Sent initial data (153 bytes) 28285 1727204262.06697: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28285 1727204262.06700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28285 1727204262.06739: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28285 1727204262.06743: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28285 1727204262.06745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28285 1727204262.06799: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28285 1727204262.06802: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28285 1727204262.06821: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28285 1727204262.06880: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 28285 1727204262.09325: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28285 1727204262.09381: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28285 1727204262.09442: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28285ojofnqq2/tmp7jem3d52 /root/.ansible/tmp/ansible-tmp-1727204262.011042-28500-165936024065541/AnsiballZ_setup.py <<< 28285 1727204262.09546: stderr chunk 
(state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28285 1727204262.11417: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28285 1727204262.11701: stderr chunk (state=3): >>><<< 28285 1727204262.11705: stdout chunk (state=3): >>><<< 28285 1727204262.11707: done transferring module to remote 28285 1727204262.11709: _low_level_execute_command(): starting 28285 1727204262.11711: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204262.011042-28500-165936024065541/ /root/.ansible/tmp/ansible-tmp-1727204262.011042-28500-165936024065541/AnsiballZ_setup.py && sleep 0' 28285 1727204262.12362: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28285 1727204262.12390: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28285 1727204262.12439: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 28285 1727204262.12443: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28285 1727204262.12445: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28285 1727204262.12447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28285 1727204262.12499: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28285 1727204262.12520: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28285 1727204262.12611: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 28285 1727204262.15088: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28285 1727204262.15148: stderr chunk (state=3): >>><<< 28285 1727204262.15151: stdout chunk (state=3): >>><<< 28285 1727204262.15245: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 28285 1727204262.15248: _low_level_execute_command(): starting 28285 1727204262.15251: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204262.011042-28500-165936024065541/AnsiballZ_setup.py && sleep 0' 28285 1727204262.15844: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28285 1727204262.15872: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28285 1727204262.15898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28285 1727204262.15924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28285 1727204262.16028: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 28285 1727204262.16040: stderr chunk (state=3): >>>debug2: match not found <<< 28285 1727204262.16055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28285 1727204262.16076: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28285 1727204262.16089: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 28285 1727204262.16107: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28285 1727204262.16134: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28285 1727204262.16155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28285 1727204262.16197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28285 1727204262.16246: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28285 1727204262.16255: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28285 1727204262.16259: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28285 1727204262.16317: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 28285 1727204262.19037: stdout chunk (state=3): >>>import _frozen_importlib # frozen<<< 28285 1727204262.19042: stdout chunk (state=3): >>> import _imp # builtin <<< 28285 1727204262.19082: stdout chunk (state=3): >>>import '_thread' # <<< 28285 1727204262.19086: stdout chunk (state=3): >>>import '_warnings' # <<< 28285 1727204262.19100: stdout chunk (state=3): >>>import '_weakref' # <<< 28285 1727204262.19189: stdout chunk (state=3): >>>import '_io' # <<< 28285 1727204262.19209: stdout chunk (state=3): >>>import 'marshal' # <<< 28285 1727204262.19270: stdout chunk (state=3): >>>import 'posix' # <<< 28285 1727204262.19319: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 28285 1727204262.19332: stdout chunk (state=3): >>># installing zipimport hook <<< 28285 1727204262.19387: stdout chunk (state=3): >>>import 'time' # <<< 28285 1727204262.19400: stdout chunk (state=3): >>> <<< 28285 1727204262.19416: stdout chunk (state=3): >>>import 'zipimport' # <<< 28285 1727204262.19421: stdout chunk (state=3): >>># installed zipimport hook <<< 28285 1727204262.19504: stdout chunk 
(state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py<<< 28285 1727204262.19508: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc'<<< 28285 1727204262.19513: stdout chunk (state=3): >>> <<< 28285 1727204262.19546: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py<<< 28285 1727204262.19555: stdout chunk (state=3): >>> <<< 28285 1727204262.19580: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' <<< 28285 1727204262.19604: stdout chunk (state=3): >>>import '_codecs' # <<< 28285 1727204262.19608: stdout chunk (state=3): >>> <<< 28285 1727204262.19654: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05bf3dc0><<< 28285 1727204262.19660: stdout chunk (state=3): >>> <<< 28285 1727204262.19715: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 28285 1727204262.19738: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc'<<< 28285 1727204262.19751: stdout chunk (state=3): >>> <<< 28285 1727204262.19781: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05b983a0><<< 28285 1727204262.19786: stdout chunk (state=3): >>> <<< 28285 1727204262.19789: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05bf3b20><<< 28285 1727204262.19791: stdout chunk (state=3): >>> <<< 28285 1727204262.19836: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py<<< 28285 1727204262.19840: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc'<<< 28285 1727204262.19842: stdout chunk (state=3): >>> <<< 28285 1727204262.19879: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05bf3ac0><<< 28285 1727204262.19883: stdout chunk (state=3): >>> <<< 28285 1727204262.19915: stdout chunk (state=3): >>>import '_signal' # <<< 28285 1727204262.19920: stdout chunk (state=3): >>> <<< 28285 1727204262.19959: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py<<< 28285 1727204262.19963: stdout chunk (state=3): >>> <<< 28285 1727204262.19967: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc'<<< 28285 1727204262.19969: stdout chunk (state=3): >>> <<< 28285 1727204262.19998: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05b98490><<< 28285 1727204262.20001: stdout chunk (state=3): >>> <<< 28285 1727204262.20032: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py<<< 28285 1727204262.20044: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 28285 1727204262.20074: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py<<< 28285 
1727204262.20091: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc'<<< 28285 1727204262.20094: stdout chunk (state=3): >>> <<< 28285 1727204262.20124: stdout chunk (state=3): >>>import '_abc' # <<< 28285 1727204262.20137: stdout chunk (state=3): >>>import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05b98940><<< 28285 1727204262.20143: stdout chunk (state=3): >>> <<< 28285 1727204262.20177: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05b98670><<< 28285 1727204262.20181: stdout chunk (state=3): >>> <<< 28285 1727204262.20216: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py<<< 28285 1727204262.20236: stdout chunk (state=3): >>> <<< 28285 1727204262.20239: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc'<<< 28285 1727204262.20244: stdout chunk (state=3): >>> <<< 28285 1727204262.20279: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py<<< 28285 1727204262.20282: stdout chunk (state=3): >>> <<< 28285 1727204262.20309: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc'<<< 28285 1727204262.20313: stdout chunk (state=3): >>> <<< 28285 1727204262.20340: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py<<< 28285 1727204262.20345: stdout chunk (state=3): >>> <<< 28285 1727204262.20369: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc'<<< 28285 1727204262.20382: stdout chunk (state=3): >>> <<< 28285 1727204262.20405: stdout chunk (state=3): >>>import '_stat' # <<< 28285 1727204262.20417: stdout chunk (state=3): >>>import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05b4f190><<< 28285 1727204262.20420: stdout chunk (state=3): >>> <<< 28285 1727204262.20450: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py<<< 28285 1727204262.20456: stdout chunk (state=3): >>> <<< 28285 1727204262.20490: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc'<<< 28285 1727204262.20494: stdout chunk (state=3): >>> <<< 28285 1727204262.20605: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05b4f220><<< 28285 1727204262.20609: stdout chunk (state=3): >>> <<< 28285 1727204262.20642: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py<<< 28285 1727204262.20650: stdout chunk (state=3): >>> <<< 28285 1727204262.20673: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 28285 1727204262.20712: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py<<< 28285 1727204262.20727: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc'<<< 28285 1727204262.20738: stdout chunk (state=3): >>> import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05b72850><<< 28285 1727204262.20747: stdout chunk (state=3): >>> <<< 28285 
1727204262.20760: stdout chunk (state=3): >>>import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05b4f940><<< 28285 1727204262.20770: stdout chunk (state=3): >>> <<< 28285 1727204262.20818: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05bb0880><<< 28285 1727204262.20822: stdout chunk (state=3): >>> <<< 28285 1727204262.20850: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py<<< 28285 1727204262.20855: stdout chunk (state=3): >>> <<< 28285 1727204262.20873: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' <<< 28285 1727204262.20890: stdout chunk (state=3): >>>import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05b48d90> <<< 28285 1727204262.20966: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py<<< 28285 1727204262.20972: stdout chunk (state=3): >>> <<< 28285 1727204262.20993: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc'<<< 28285 1727204262.21015: stdout chunk (state=3): >>> <<< 28285 1727204262.21029: stdout chunk (state=3): >>>import '_locale' # <<< 28285 1727204262.21031: stdout chunk (state=3): >>>import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05b72d90><<< 28285 1727204262.21033: stdout chunk (state=3): >>> <<< 28285 1727204262.21122: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05b98970><<< 28285 1727204262.21127: stdout chunk (state=3): >>> <<< 28285 1727204262.21177: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) <<< 28285 1727204262.21181: stdout chunk (state=3): >>> [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information.<<< 28285 1727204262.21183: stdout chunk (state=3): >>> <<< 28285 1727204262.21736: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py<<< 28285 1727204262.21745: stdout chunk (state=3): >>> <<< 28285 1727204262.21779: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc'<<< 28285 1727204262.21783: stdout chunk (state=3): >>> <<< 28285 1727204262.21807: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py<<< 28285 1727204262.21814: stdout chunk (state=3): >>> <<< 28285 1727204262.21830: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc'<<< 28285 1727204262.21835: stdout chunk (state=3): >>> <<< 28285 1727204262.21870: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py<<< 28285 1727204262.21875: stdout chunk (state=3): >>> <<< 28285 1727204262.21903: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc'<<< 28285 1727204262.21909: stdout chunk (state=3): >>> <<< 28285 1727204262.21936: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py<<< 28285 1727204262.21941: stdout chunk (state=3): >>> <<< 28285 1727204262.21967: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc'<<< 28285 1727204262.21972: stdout chunk (state=3): >>> <<< 28285 1727204262.21995: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05aeff10><<< 28285 1727204262.21999: stdout chunk (state=3): >>> <<< 28285 1727204262.22076: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05af40a0> <<< 28285 1727204262.22103: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py<<< 28285 1727204262.22112: stdout chunk (state=3): >>> <<< 28285 1727204262.22125: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc'<<< 28285 1727204262.22129: stdout chunk (state=3): >>> <<< 28285 1727204262.22155: stdout chunk (state=3): >>>import '_sre' # <<< 28285 1727204262.22158: stdout chunk (state=3): >>> <<< 28285 1727204262.22188: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py<<< 28285 1727204262.22192: stdout chunk (state=3): >>> <<< 28285 1727204262.22216: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 28285 1727204262.22249: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py<<< 28285 1727204262.22254: stdout chunk (state=3): >>> <<< 28285 1727204262.22268: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 28285 1727204262.22302: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05ae75b0><<< 28285 1727204262.22311: stdout chunk (state=3): >>> <<< 28285 1727204262.22327: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05aee6a0><<< 28285 1727204262.22332: stdout chunk (state=3): >>> <<< 28285 1727204262.22353: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05aef3d0><<< 28285 1727204262.22358: stdout chunk (state=3): >>> <<< 28285 1727204262.22390: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py<<< 28285 1727204262.22397: stdout chunk (state=3): >>> <<< 28285 1727204262.22491: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc'<<< 28285 1727204262.22494: stdout chunk (state=3): >>> <<< 28285 1727204262.22528: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py<<< 28285 1727204262.22533: stdout chunk (state=3): >>> <<< 28285 1727204262.22583: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc'<<< 28285 1727204262.22586: stdout chunk (state=3): >>> <<< 28285 1727204262.22619: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py<<< 28285 1727204262.22634: stdout chunk (state=3): >>> <<< 28285 1727204262.22637: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 28285 1727204262.22696: stdout chunk (state=3): >>># extension module '_heapq' 
loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so'<<< 28285 1727204262.22715: stdout chunk (state=3): >>> # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so'<<< 28285 1727204262.22718: stdout chunk (state=3): >>> import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e05792eb0><<< 28285 1727204262.22720: stdout chunk (state=3): >>> <<< 28285 1727204262.22723: stdout chunk (state=3): >>>import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e057929a0><<< 28285 1727204262.22724: stdout chunk (state=3): >>> <<< 28285 1727204262.22749: stdout chunk (state=3): >>>import 'itertools' # <<< 28285 1727204262.22756: stdout chunk (state=3): >>> <<< 28285 1727204262.22795: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py<<< 28285 1727204262.22811: stdout chunk (state=3): >>> <<< 28285 1727204262.22814: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' <<< 28285 1727204262.22818: stdout chunk (state=3): >>>import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05792fa0><<< 28285 1727204262.22820: stdout chunk (state=3): >>> <<< 28285 1727204262.22861: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py<<< 28285 1727204262.22870: stdout chunk (state=3): >>> <<< 28285 1727204262.22887: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc'<<< 28285 1727204262.22891: stdout chunk (state=3): >>> <<< 28285 1727204262.22923: stdout chunk (state=3): >>>import '_operator' # <<< 28285 1727204262.22940: stdout chunk (state=3): >>> <<< 28285 1727204262.22947: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05792df0> <<< 28285 1727204262.22997: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py<<< 28285 1727204262.23001: stdout chunk (state=3): >>> <<< 28285 1727204262.23018: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc'<<< 28285 1727204262.23021: stdout chunk (state=3): >>> <<< 28285 1727204262.23023: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e057a2160><<< 28285 1727204262.23025: stdout chunk (state=3): >>> <<< 28285 1727204262.23053: stdout chunk (state=3): >>>import '_collections' # <<< 28285 1727204262.23058: stdout chunk (state=3): >>> <<< 28285 1727204262.23119: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05ac9e20><<< 28285 1727204262.23126: stdout chunk (state=3): >>> <<< 28285 1727204262.23142: stdout chunk (state=3): >>>import '_functools' # <<< 28285 1727204262.23148: stdout chunk (state=3): >>> <<< 28285 1727204262.23196: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05ac1700> <<< 28285 1727204262.23287: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py <<< 28285 1727204262.23316: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' <<< 28285 1727204262.23319: stdout chunk (state=3): 
>>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05ad5760> <<< 28285 1727204262.23322: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05af5eb0><<< 28285 1727204262.23323: stdout chunk (state=3): >>> <<< 28285 1727204262.23369: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py<<< 28285 1727204262.23374: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 28285 1727204262.23444: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so'<<< 28285 1727204262.23456: stdout chunk (state=3): >>> # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so'<<< 28285 1727204262.23459: stdout chunk (state=3): >>> import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e057a2d60><<< 28285 1727204262.23461: stdout chunk (state=3): >>> <<< 28285 1727204262.23462: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05ac9340><<< 28285 1727204262.23466: stdout chunk (state=3): >>> <<< 28285 1727204262.23517: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so'<<< 28285 1727204262.23543: stdout chunk (state=3): >>> <<< 28285 1727204262.23548: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so'<<< 28285 1727204262.23557: stdout chunk (state=3): >>> import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e05ad5370><<< 28285 1727204262.23562: stdout chunk (state=3): >>> <<< 28285 1727204262.23571: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05afba60> <<< 28285 1727204262.23602: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py<<< 28285 1727204262.23620: stdout chunk (state=3): >>> <<< 28285 1727204262.23623: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc'<<< 28285 1727204262.23625: stdout chunk (state=3): >>> <<< 28285 1727204262.23679: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py <<< 28285 1727204262.23685: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 28285 1727204262.23717: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 28285 1727204262.23737: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc'<<< 28285 1727204262.23742: stdout chunk (state=3): >>> <<< 28285 1727204262.23782: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e057a2f40><<< 28285 1727204262.23785: stdout chunk (state=3): >>> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e057a2e80><<< 28285 1727204262.23786: stdout chunk (state=3): >>> <<< 28285 1727204262.23836: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py<<< 28285 1727204262.23840: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc'<<< 28285 1727204262.23850: stdout chunk (state=3): >>> import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e057a2df0><<< 28285 1727204262.23860: stdout chunk (state=3): >>> <<< 28285 1727204262.23907: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py <<< 28285 1727204262.23919: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 28285 1727204262.23961: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py<<< 28285 1727204262.23967: stdout chunk (state=3): >>> <<< 28285 1727204262.23985: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc'<<< 28285 1727204262.23990: stdout chunk (state=3): >>> <<< 28285 1727204262.24030: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 28285 1727204262.24112: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 28285 1727204262.24174: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py<<< 28285 1727204262.24178: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' <<< 28285 1727204262.24180: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05776460> <<< 28285 1727204262.24216: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 28285 1727204262.24241: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 28285 1727204262.24292: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05776550><<< 28285 1727204262.24296: stdout chunk (state=3): >>> <<< 28285 1727204262.24481: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e057540d0><<< 28285 1727204262.24486: stdout chunk (state=3): >>> <<< 28285 1727204262.24538: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e057a5b20><<< 28285 1727204262.24541: stdout chunk (state=3): >>> <<< 28285 1727204262.24565: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e057a54c0> <<< 28285 1727204262.24596: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py<<< 28285 1727204262.24605: stdout chunk (state=3): >>> <<< 28285 1727204262.24618: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 28285 1727204262.24679: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 28285 1727204262.24705: stdout chunk (state=3): >>># code object 
from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 28285 1727204262.24736: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py<<< 28285 1727204262.24745: stdout chunk (state=3): >>> <<< 28285 1727204262.24766: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' <<< 28285 1727204262.24787: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e056c42b0> <<< 28285 1727204262.24842: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05761d60><<< 28285 1727204262.24846: stdout chunk (state=3): >>> <<< 28285 1727204262.24928: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e057a5fa0><<< 28285 1727204262.24940: stdout chunk (state=3): >>> <<< 28285 1727204262.24945: stdout chunk (state=3): >>>import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05afb0d0> <<< 28285 1727204262.24988: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py<<< 28285 1727204262.24993: stdout chunk (state=3): >>> <<< 28285 1727204262.25034: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc'<<< 28285 1727204262.25038: stdout chunk (state=3): >>> <<< 28285 1727204262.25068: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py<<< 28285 1727204262.25090: stdout chunk (state=3): >>> <<< 28285 1727204262.25101: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' <<< 28285 1727204262.25104: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e056d4be0><<< 28285 1727204262.25110: stdout chunk (state=3): >>> <<< 28285 1727204262.25130: stdout chunk (state=3): >>>import 'errno' # <<< 28285 1727204262.25133: stdout chunk (state=3): >>> <<< 28285 1727204262.25202: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so'<<< 28285 1727204262.25205: stdout chunk (state=3): >>> # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so'<<< 28285 1727204262.25206: stdout chunk (state=3): >>> import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e056d4f10><<< 28285 1727204262.25208: stdout chunk (state=3): >>> <<< 28285 1727204262.25240: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py<<< 28285 1727204262.25244: stdout chunk (state=3): >>> <<< 28285 1727204262.25266: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc'<<< 28285 1727204262.25270: stdout chunk (state=3): >>> <<< 28285 1727204262.25297: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py<<< 28285 1727204262.25321: stdout chunk (state=3): >>> <<< 28285 1727204262.25326: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' <<< 28285 1727204262.25346: stdout chunk (state=3): >>>import '_compression' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5e056e7820><<< 28285 1727204262.25349: stdout chunk (state=3): >>> <<< 28285 1727204262.25383: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py<<< 28285 1727204262.25387: stdout chunk (state=3): >>> <<< 28285 1727204262.25435: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc'<<< 28285 1727204262.25439: stdout chunk (state=3): >>> <<< 28285 1727204262.25484: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e056e7d60><<< 28285 1727204262.25488: stdout chunk (state=3): >>> <<< 28285 1727204262.25543: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' <<< 28285 1727204262.25550: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' <<< 28285 1727204262.25565: stdout chunk (state=3): >>>import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e05680490> <<< 28285 1727204262.25584: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e056d4f40><<< 28285 1727204262.25589: stdout chunk (state=3): >>> <<< 28285 1727204262.25617: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py<<< 28285 1727204262.25621: stdout chunk (state=3): >>> <<< 28285 1727204262.25642: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc'<<< 28285 1727204262.25645: stdout chunk (state=3): >>> <<< 28285 1727204262.25710: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so'<<< 28285 1727204262.25722: stdout chunk (state=3): >>> # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so'<<< 28285 1727204262.25725: stdout chunk (state=3): >>> import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e05690370><<< 28285 1727204262.25727: stdout chunk (state=3): >>> <<< 28285 1727204262.25753: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e056e76a0> <<< 28285 1727204262.25776: stdout chunk (state=3): >>>import 'pwd' # <<< 28285 1727204262.25820: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so'<<< 28285 1727204262.25832: stdout chunk (state=3): >>> # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so'<<< 28285 1727204262.25835: stdout chunk (state=3): >>> import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e05690430><<< 28285 1727204262.25837: stdout chunk (state=3): >>> <<< 28285 1727204262.25895: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e057a2ac0><<< 28285 1727204262.25900: stdout chunk (state=3): >>> <<< 28285 1727204262.25926: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py<<< 28285 1727204262.25931: stdout chunk (state=3): >>> <<< 28285 1727204262.25965: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc'<<< 28285 1727204262.25971: stdout chunk (state=3): >>> <<< 28285 1727204262.26010: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py<<< 28285 1727204262.26015: stdout chunk (state=3): >>> <<< 28285 1727204262.26040: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc'<<< 28285 1727204262.26044: stdout chunk (state=3): >>> <<< 28285 1727204262.26102: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' <<< 28285 1727204262.26113: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e056ac790><<< 28285 1727204262.26124: stdout chunk (state=3): >>> <<< 28285 1727204262.26151: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py<<< 28285 1727204262.26169: stdout chunk (state=3): >>> <<< 28285 1727204262.26172: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc'<<< 28285 1727204262.26176: stdout chunk (state=3): >>> <<< 28285 1727204262.26246: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' <<< 28285 1727204262.26253: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so'<<< 28285 1727204262.26255: stdout chunk (state=3): >>> import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e056aca60><<< 28285 1727204262.26256: stdout chunk (state=3): >>> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e056ac850><<< 28285 1727204262.26258: stdout chunk (state=3): >>> <<< 28285 1727204262.26314: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' <<< 28285 1727204262.26320: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e056ac940> <<< 28285 1727204262.26374: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py<<< 28285 1727204262.26377: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc'<<< 28285 1727204262.26382: stdout chunk (state=3): >>> <<< 28285 1727204262.26651: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so'<<< 28285 1727204262.26654: stdout chunk (state=3): >>> # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e056acd90><<< 28285 1727204262.26660: stdout chunk (state=3): >>> <<< 28285 1727204262.26716: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so'<<< 28285 1727204262.26719: stdout chunk (state=3): 
>>> # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so'<<< 28285 1727204262.26731: stdout chunk (state=3): >>> import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e056b62e0> <<< 28285 1727204262.26743: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e056ac9d0> <<< 28285 1727204262.26776: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e056a0b20><<< 28285 1727204262.26779: stdout chunk (state=3): >>> <<< 28285 1727204262.26818: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e057a26a0><<< 28285 1727204262.26823: stdout chunk (state=3): >>> <<< 28285 1727204262.26862: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py<<< 28285 1727204262.26868: stdout chunk (state=3): >>> <<< 28285 1727204262.26945: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc'<<< 28285 1727204262.26952: stdout chunk (state=3): >>> <<< 28285 1727204262.27003: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e056acb80><<< 28285 1727204262.27006: stdout chunk (state=3): >>> <<< 28285 1727204262.27220: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc'<<< 28285 1727204262.27225: stdout chunk (state=3): >>> <<< 28285 1727204262.27253: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f5e055da760><<< 28285 1727204262.27259: stdout chunk (state=3): >>> <<< 28285 1727204262.27734: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip' <<< 28285 1727204262.27739: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.27895: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.27940: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/__init__.py <<< 28285 1727204262.27945: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.27976: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.27981: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/__init__.py <<< 28285 1727204262.28011: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.29832: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204262.29838: stdout chunk (state=3): >>> <<< 28285 1727204262.31354: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' <<< 28285 1727204262.31357: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04fec8b0> <<< 28285 1727204262.31414: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 28285 1727204262.31472: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py <<< 28285 1727204262.31510: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 28285 1727204262.31529: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 28285 1727204262.31594: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' <<< 28285 1727204262.31598: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e04fec160> <<< 28285 1727204262.31663: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04fec280> <<< 28285 1727204262.31717: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04fec5e0> <<< 28285 1727204262.31771: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py<<< 28285 1727204262.31788: stdout chunk (state=3): >>> <<< 28285 1727204262.31791: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 28285 1727204262.31884: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04fec4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04fece20><<< 28285 1727204262.31887: stdout chunk (state=3): >>> <<< 28285 1727204262.31899: stdout chunk (state=3): >>>import 'atexit' # <<< 28285 1727204262.31954: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so'<<< 28285 1727204262.31981: stdout chunk (state=3): >>> import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e04fec580> <<< 28285 1727204262.32000: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 28285 1727204262.32039: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 28285 1727204262.32129: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04fec100> <<< 28285 1727204262.32191: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 28285 1727204262.32235: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 28285 1727204262.32302: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 28285 1727204262.32414: stdout chunk (state=3): >>>import 'signal' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04f81040> <<< 28285 1727204262.32488: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' <<< 28285 1727204262.32495: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so'<<< 28285 1727204262.32513: stdout chunk (state=3): >>> import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e04ec93d0> <<< 28285 1727204262.32543: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so'<<< 28285 1727204262.32559: stdout chunk (state=3): >>> # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e04ec90d0><<< 28285 1727204262.32562: stdout chunk (state=3): >>> <<< 28285 1727204262.32593: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py<<< 28285 1727204262.32612: stdout chunk (state=3): >>> <<< 28285 1727204262.32620: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 28285 1727204262.32673: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04ec9d30> <<< 28285 1727204262.32703: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04fd4d90><<< 28285 1727204262.32708: stdout chunk (state=3): >>> <<< 28285 1727204262.33000: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04fd43a0><<< 28285 1727204262.33005: stdout chunk (state=3): >>> <<< 28285 1727204262.33029: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py<<< 28285 1727204262.33045: stdout chunk (state=3): >>> <<< 28285 1727204262.33053: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 28285 1727204262.33090: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04fd4f40><<< 28285 1727204262.33095: stdout chunk (state=3): >>> <<< 28285 1727204262.33120: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py<<< 28285 1727204262.33125: stdout chunk (state=3): >>> <<< 28285 1727204262.33147: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc'<<< 28285 1727204262.33156: stdout chunk (state=3): >>> <<< 28285 1727204262.33204: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py <<< 28285 1727204262.33207: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 28285 1727204262.33242: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py<<< 28285 1727204262.33247: stdout chunk (state=3): >>> <<< 28285 1727204262.33274: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc'<<< 28285 1727204262.33278: stdout chunk (state=3): >>> <<< 28285 1727204262.33316: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py<<< 28285 1727204262.33327: stdout chunk (state=3): >>> <<< 28285 1727204262.33331: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' <<< 28285 1727204262.33343: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e055daa90><<< 28285 1727204262.33350: stdout chunk (state=3): >>> <<< 28285 1727204262.33465: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04faadc0><<< 28285 1727204262.33469: stdout chunk (state=3): >>> <<< 28285 1727204262.33470: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04faa490> <<< 28285 1727204262.33493: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04fe9a90> <<< 28285 1727204262.33541: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' <<< 28285 1727204262.33559: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e04faa5b0><<< 28285 1727204262.33562: stdout chunk (state=3): >>> <<< 28285 1727204262.33622: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py <<< 28285 1727204262.33625: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04faa5e0><<< 28285 1727204262.33631: stdout chunk (state=3): >>> <<< 28285 1727204262.33674: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py<<< 28285 1727204262.33678: stdout chunk (state=3): >>> <<< 28285 1727204262.33698: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc'<<< 28285 1727204262.33707: stdout chunk (state=3): >>> <<< 28285 1727204262.33741: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 28285 1727204262.33808: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc'<<< 28285 1727204262.33810: stdout chunk (state=3): >>> <<< 28285 1727204262.33924: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so'<<< 28285 1727204262.33930: stdout chunk (state=3): >>> # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so'<<< 28285 1727204262.33933: stdout chunk (state=3): >>> import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e04f34f70><<< 28285 1727204262.33943: stdout chunk (state=3): >>> <<< 28285 1727204262.33952: stdout chunk (state=3): >>>import 
'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e055602e0> <<< 28285 1727204262.33989: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py<<< 28285 1727204262.33994: stdout chunk (state=3): >>> <<< 28285 1727204262.34016: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc'<<< 28285 1727204262.34021: stdout chunk (state=3): >>> <<< 28285 1727204262.34118: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so'<<< 28285 1727204262.34121: stdout chunk (state=3): >>> # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so'<<< 28285 1727204262.34123: stdout chunk (state=3): >>> import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e04f317f0><<< 28285 1727204262.34125: stdout chunk (state=3): >>> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05560460><<< 28285 1727204262.34126: stdout chunk (state=3): >>> <<< 28285 1727204262.34166: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py<<< 28285 1727204262.34174: stdout chunk (state=3): >>> <<< 28285 1727204262.34221: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 28285 1727204262.34268: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py<<< 28285 1727204262.34290: stdout chunk (state=3): >>> <<< 28285 1727204262.34293: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' <<< 28285 1727204262.34299: stdout chunk (state=3): >>>import '_string' # <<< 28285 1727204262.34301: stdout chunk (state=3): >>> <<< 28285 1727204262.34389: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05560c40> <<< 28285 1727204262.34758: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04f31790> <<< 28285 1727204262.34762: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e05560130> <<< 28285 1727204262.34830: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so'<<< 28285 1727204262.34834: stdout chunk (state=3): >>> import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e05560670><<< 28285 1727204262.34836: stdout chunk (state=3): >>> <<< 28285 1727204262.34914: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so'<<< 28285 1727204262.34917: stdout chunk (state=3): >>> # extension module 'systemd.id128' executed from 
'/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' <<< 28285 1727204262.34922: stdout chunk (state=3): >>>import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e05560730> <<< 28285 1727204262.34951: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e055599a0> <<< 28285 1727204262.34998: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 28285 1727204262.35035: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 28285 1727204262.35077: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 28285 1727204262.35157: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so'<<< 28285 1727204262.35160: stdout chunk (state=3): >>> import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e04f278e0><<< 28285 1727204262.35167: stdout chunk (state=3): >>> <<< 28285 1727204262.35487: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so'<<< 28285 1727204262.35490: stdout chunk (state=3): >>> <<< 28285 1727204262.35497: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' <<< 28285 1727204262.35500: stdout chunk (state=3): >>>import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e04f44c70> <<< 28285 1727204262.35514: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04f30520> <<< 28285 1727204262.35598: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so'<<< 28285 1727204262.35606: stdout chunk (state=3): >>> # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so'<<< 28285 1727204262.35610: stdout chunk (state=3): >>> import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e04f27e80><<< 28285 1727204262.35614: stdout chunk (state=3): >>> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04f30940><<< 28285 1727204262.35617: stdout chunk (state=3): >>> <<< 28285 1727204262.35653: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.35657: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.35673: stdout chunk (state=3): >>>import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py <<< 28285 1727204262.35700: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.35827: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.36018: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.36042: stdout chunk (state=3): 
>>># zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available <<< 28285 1727204262.36094: stdout chunk (state=3): >>>import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py <<< 28285 1727204262.36108: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.36260: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.36422: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.37158: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204262.37166: stdout chunk (state=3): >>> <<< 28285 1727204262.37935: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py <<< 28285 1727204262.37955: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 28285 1727204262.37977: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # <<< 28285 1727204262.37981: stdout chunk (state=3): >>>import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py<<< 28285 1727204262.37986: stdout chunk (state=3): >>> <<< 28285 1727204262.38019: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py<<< 28285 1727204262.38039: stdout chunk (state=3): >>> <<< 28285 1727204262.38042: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc'<<< 28285 1727204262.38047: stdout chunk (state=3): >>> <<< 28285 1727204262.38134: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so'<<< 28285 1727204262.38139: stdout chunk (state=3): >>> # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e04f40790><<< 28285 1727204262.38142: stdout chunk (state=3): >>> <<< 28285 1727204262.38238: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py<<< 28285 1727204262.38243: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 28285 1727204262.38269: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04f7f850> <<< 28285 1727204262.38290: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04ae5fa0><<< 28285 1727204262.38294: stdout chunk (state=3): >>> <<< 28285 1727204262.38356: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py<<< 28285 1727204262.38364: stdout chunk (state=3): >>> <<< 28285 1727204262.38385: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.38414: stdout chunk (state=3): >>># zipimport: zlib 
available<<< 28285 1727204262.38429: stdout chunk (state=3): >>> <<< 28285 1727204262.38451: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/_text.py<<< 28285 1727204262.38457: stdout chunk (state=3): >>> <<< 28285 1727204262.38483: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204262.38492: stdout chunk (state=3): >>> <<< 28285 1727204262.38702: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.38897: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py<<< 28285 1727204262.38902: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc'<<< 28285 1727204262.38905: stdout chunk (state=3): >>> <<< 28285 1727204262.38963: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04fb2310><<< 28285 1727204262.38970: stdout chunk (state=3): >>> <<< 28285 1727204262.38972: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204262.38975: stdout chunk (state=3): >>> <<< 28285 1727204262.39624: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.40221: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.40315: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.40424: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/common/collections.py <<< 28285 1727204262.40441: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.40494: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.40561: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py <<< 28285 1727204262.40585: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.40676: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.40802: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/errors.py <<< 28285 1727204262.40853: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 28285 1727204262.40871: stdout chunk (state=3): >>>import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py <<< 28285 1727204262.40885: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.40945: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.41008: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 28285 1727204262.41027: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.41327: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.41651: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 28285 1727204262.41706: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 28285 1727204262.41709: stdout chunk (state=3): >>>import '_ast' # <<< 28285 1727204262.41716: stdout chunk (state=3): >>> <<< 28285 1727204262.41831: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04ff2ca0> <<< 28285 1727204262.41835: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204262.41841: stdout chunk (state=3): >>> <<< 28285 1727204262.41959: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.42083: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py <<< 28285 1727204262.42096: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 28285 1727204262.42138: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.42197: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.42259: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/common/locale.py <<< 28285 1727204262.42278: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.42335: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.42398: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.42530: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.42628: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 28285 1727204262.42678: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 28285 1727204262.42812: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e04f64c70><<< 28285 1727204262.42818: stdout chunk (state=3): >>> <<< 28285 1727204262.42941: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04ff2bb0> <<< 28285 1727204262.42996: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/common/process.py <<< 28285 1727204262.43022: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.43110: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 28285 1727204262.43213: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.43247: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.43344: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 28285 1727204262.43347: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 28285 1727204262.43394: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 28285 1727204262.43446: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 28285 1727204262.43511: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 28285 1727204262.43528: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 28285 1727204262.43665: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04f422b0> <<< 28285 1727204262.43730: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04fc0b80><<< 28285 1727204262.43735: stdout chunk (state=3): >>> <<< 28285 1727204262.43837: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04940160><<< 28285 1727204262.43844: stdout chunk (state=3): >>> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py<<< 28285 1727204262.43852: stdout chunk (state=3): >>> <<< 28285 1727204262.43876: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.43930: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.43985: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py <<< 28285 1727204262.43988: stdout chunk (state=3): >>>import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py<<< 28285 1727204262.43990: stdout chunk (state=3): >>> <<< 28285 1727204262.44130: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/basic.py<<< 28285 1727204262.44135: stdout chunk (state=3): >>> <<< 28285 1727204262.44166: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204262.44169: stdout chunk (state=3): >>> <<< 28285 1727204262.44191: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.44216: stdout chunk (state=3): >>>import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/modules/__init__.py <<< 28285 1727204262.44256: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204262.44261: stdout chunk (state=3): >>> <<< 28285 1727204262.44368: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204262.44370: stdout chunk (state=3): >>> <<< 28285 1727204262.44465: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 
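
[Editorial sketch] Every "loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/..." line above is the remote interpreter resolving a package straight out of the payload archive through the standard zipimport machinery, which is also why "# zipimport: zlib available" keeps appearing in the verbose import trace. A self-contained sketch of that mechanism only (demo_payload.zip and demo_pkg are made up for the example, not taken from this run):

    # Build a tiny zip containing a package, put the archive on sys.path,
    # and import from it; CPython's zipimport handles the rest.
    import os, sys, tempfile, zipfile

    workdir = tempfile.mkdtemp()
    payload = os.path.join(workdir, "demo_payload.zip")   # hypothetical name

    with zipfile.ZipFile(payload, "w") as zf:
        zf.writestr("demo_pkg/__init__.py", "")
        zf.writestr("demo_pkg/util.py", "def greet():\n    return 'imported from zip'\n")

    sys.path.insert(0, payload)      # archives on sys.path are handled by zipimport
    from demo_pkg import util        # resolved inside demo_payload.zip
    print(util.greet())              # -> imported from zip
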
1727204262.44496: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204262.44499: stdout chunk (state=3): >>> <<< 28285 1727204262.44554: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204262.44559: stdout chunk (state=3): >>> <<< 28285 1727204262.44633: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204262.44636: stdout chunk (state=3): >>> <<< 28285 1727204262.44700: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204262.44709: stdout chunk (state=3): >>> <<< 28285 1727204262.44758: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.44809: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py <<< 28285 1727204262.44831: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204262.44836: stdout chunk (state=3): >>> <<< 28285 1727204262.44955: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204262.44961: stdout chunk (state=3): >>> <<< 28285 1727204262.45100: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.45117: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.45176: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py <<< 28285 1727204262.45195: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.45461: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.45690: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.45755: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.45825: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc'<<< 28285 1727204262.45843: stdout chunk (state=3): >>> <<< 28285 1727204262.45889: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' <<< 28285 1727204262.45918: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py <<< 28285 1727204262.45943: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 28285 1727204262.45999: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04842100> <<< 28285 1727204262.46039: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py <<< 28285 1727204262.46061: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 28285 1727204262.46106: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 28285 1727204262.46140: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 28285 1727204262.46178: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py<<< 28285 1727204262.46228: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04aa6a60> <<< 28285 1727204262.46326: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e04aa69d0><<< 28285 1727204262.46369: stdout chunk (state=3): >>> <<< 28285 1727204262.46452: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04a7ac70> <<< 28285 1727204262.46468: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04a7ac10> <<< 28285 1727204262.46518: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04aebbb0> <<< 28285 1727204262.46566: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04aebc40> <<< 28285 1727204262.46603: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc'<<< 28285 1727204262.46655: stdout chunk (state=3): >>> # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py <<< 28285 1727204262.46677: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 28285 1727204262.46728: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e04a8a310> <<< 28285 1727204262.46812: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04a8a9a0> <<< 28285 1727204262.46825: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 28285 1727204262.46896: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04a8a940> <<< 28285 1727204262.46908: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 28285 1727204262.47343: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from 
'/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e048a40d0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05569c40> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04aeb880> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available <<< 28285 1727204262.47427: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py <<< 28285 1727204262.47446: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.47514: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.47605: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py <<< 28285 1727204262.47635: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.47676: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py <<< 28285 1727204262.47696: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.47728: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.47789: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py <<< 28285 1727204262.47808: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.47869: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.47941: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py <<< 28285 1727204262.47953: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.48008: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.48080: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py <<< 28285 1727204262.48097: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.48188: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.48276: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.48353: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.48479: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # 
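
[Editorial sketch] The multiprocessing.pool and queue imports just above land immediately before ansible.module_utils.facts.timeout in the trace, which is consistent with a pool-based enforcement of the gather_timeout budget (10 seconds in the invocation echoed further down); the actual implementation is not visible in this log. Below is only a generic sketch of the run-a-collector-with-a-deadline pattern using the same standard-library pieces; run_with_timeout and slow_fact are made-up names:

    # Generic pattern: run a function with a wall-clock limit and let a late
    # result surface as multiprocessing.TimeoutError.
    import time
    from multiprocessing.pool import ThreadPool

    def run_with_timeout(func, timeout_sec, *args, **kwargs):
        """Return func(*args, **kwargs), or raise TimeoutError past timeout_sec."""
        pool = ThreadPool(processes=1)
        try:
            async_result = pool.apply_async(func, args, kwargs)
            return async_result.get(timeout=timeout_sec)
        finally:
            pool.terminate()

    def slow_fact():
        time.sleep(2)
        return {"demo_fact": 42}

    if __name__ == "__main__":
        print(run_with_timeout(slow_fact, timeout_sec=10))  # finishes inside the 10s budget
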
loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py <<< 28285 1727204262.48483: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py <<< 28285 1727204262.48494: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.49145: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.49773: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py <<< 28285 1727204262.49785: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.49866: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.49941: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204262.49946: stdout chunk (state=3): >>> <<< 28285 1727204262.49995: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204262.50000: stdout chunk (state=3): >>> <<< 28285 1727204262.50051: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py <<< 28285 1727204262.50061: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py <<< 28285 1727204262.50084: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204262.50090: stdout chunk (state=3): >>> <<< 28285 1727204262.50137: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.50192: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py <<< 28285 1727204262.50211: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.50287: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.50375: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py <<< 28285 1727204262.50389: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.50445: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.50493: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py <<< 28285 1727204262.50512: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.50562: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.50630: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py <<< 28285 1727204262.50634: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.50739: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.50857: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches 
/usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 28285 1727204262.50910: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e0478cf10> <<< 28285 1727204262.50947: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py<<< 28285 1727204262.50950: stdout chunk (state=3): >>> <<< 28285 1727204262.50998: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 28285 1727204262.51273: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e0478c9d0> <<< 28285 1727204262.51309: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available <<< 28285 1727204262.51410: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.51510: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py <<< 28285 1727204262.51516: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.51639: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.51783: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py <<< 28285 1727204262.51787: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.51878: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.52001: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py <<< 28285 1727204262.52005: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.52053: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.52131: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 28285 1727204262.52168: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 28285 1727204262.52405: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e047b6c10><<< 28285 1727204262.52408: stdout chunk (state=3): >>> <<< 28285 1727204262.52845: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04806c40> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py <<< 28285 1727204262.52858: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.52945: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.53034: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from 
Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py <<< 28285 1727204262.53053: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.53174: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.53297: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.53442: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.53677: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/compat/version.py <<< 28285 1727204262.53680: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py <<< 28285 1727204262.53696: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.53742: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.53816: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py <<< 28285 1727204262.53833: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.53884: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.53967: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 28285 1727204262.54077: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so'<<< 28285 1727204262.54102: stdout chunk (state=3): >>> import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e048085e0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04808790> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py <<< 28285 1727204262.54126: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.54164: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.54186: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py <<< 28285 1727204262.54200: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.54254: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.54326: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py <<< 28285 1727204262.54341: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.54554: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.54775: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip 
/tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py <<< 28285 1727204262.54789: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.54927: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.55073: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.55129: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.55207: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py <<< 28285 1727204262.55224: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py <<< 28285 1727204262.55237: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.55336: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.55377: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.55556: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.55761: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py <<< 28285 1727204262.55780: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py <<< 28285 1727204262.55799: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.55961: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.56163: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py <<< 28285 1727204262.56169: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.56212: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.56271: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.56989: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.57746: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py <<< 28285 1727204262.57750: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py <<< 28285 1727204262.57767: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.57920: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.58055: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py<<< 28285 1727204262.58077: stdout chunk (state=3): >>> <<< 28285 1727204262.58082: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204262.58089: stdout chunk (state=3): >>> <<< 28285 1727204262.58217: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 
1727204262.58221: stdout chunk (state=3): >>> <<< 28285 1727204262.58348: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py<<< 28285 1727204262.58374: stdout chunk (state=3): >>> <<< 28285 1727204262.58378: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204262.58381: stdout chunk (state=3): >>> <<< 28285 1727204262.58602: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204262.58604: stdout chunk (state=3): >>> <<< 28285 1727204262.58807: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py<<< 28285 1727204262.58825: stdout chunk (state=3): >>> <<< 28285 1727204262.58829: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204262.58844: stdout chunk (state=3): >>> <<< 28285 1727204262.58847: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204262.58860: stdout chunk (state=3): >>> <<< 28285 1727204262.58874: stdout chunk (state=3): >>>import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py <<< 28285 1727204262.58897: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204262.58902: stdout chunk (state=3): >>> <<< 28285 1727204262.58956: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204262.58961: stdout chunk (state=3): >>> <<< 28285 1727204262.59015: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py<<< 28285 1727204262.59031: stdout chunk (state=3): >>> <<< 28285 1727204262.59036: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204262.59038: stdout chunk (state=3): >>> <<< 28285 1727204262.59171: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204262.59176: stdout chunk (state=3): >>> <<< 28285 1727204262.59310: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204262.59315: stdout chunk (state=3): >>> <<< 28285 1727204262.59612: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.59921: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py <<< 28285 1727204262.59925: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 28285 1727204262.59944: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.59991: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.60043: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py <<< 28285 1727204262.60076: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.60110: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.60159: stdout chunk (state=3): >>>import 
ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py <<< 28285 1727204262.60180: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.60270: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.60375: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py <<< 28285 1727204262.60395: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.60439: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.60518: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py <<< 28285 1727204262.60532: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.60579: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.60678: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py <<< 28285 1727204262.60696: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.60770: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.60841: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py <<< 28285 1727204262.60856: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204262.60875: stdout chunk (state=3): >>> <<< 28285 1727204262.61216: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204262.61222: stdout chunk (state=3): >>> <<< 28285 1727204262.61583: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py<<< 28285 1727204262.61592: stdout chunk (state=3): >>> <<< 28285 1727204262.61605: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.61689: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204262.61694: stdout chunk (state=3): >>> <<< 28285 1727204262.61772: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py<<< 28285 1727204262.61788: stdout chunk (state=3): >>> <<< 28285 1727204262.61791: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204262.61795: stdout chunk (state=3): >>> <<< 28285 1727204262.61850: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204262.61856: stdout chunk (state=3): >>> <<< 28285 1727204262.61912: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py<<< 28285 1727204262.61920: stdout chunk (state=3): >>> <<< 28285 1727204262.61934: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204262.61943: stdout chunk (state=3): >>> <<< 28285 1727204262.61989: stdout chunk (state=3): 
>>># zipimport: zlib available<<< 28285 1727204262.61994: stdout chunk (state=3): >>> <<< 28285 1727204262.62038: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py<<< 28285 1727204262.62043: stdout chunk (state=3): >>> <<< 28285 1727204262.62066: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204262.62074: stdout chunk (state=3): >>> <<< 28285 1727204262.62112: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204262.62117: stdout chunk (state=3): >>> <<< 28285 1727204262.62173: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py <<< 28285 1727204262.62189: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.62302: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204262.62307: stdout chunk (state=3): >>> <<< 28285 1727204262.62421: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py <<< 28285 1727204262.62445: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204262.62452: stdout chunk (state=3): >>> <<< 28285 1727204262.62483: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204262.62494: stdout chunk (state=3): >>> <<< 28285 1727204262.62502: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py <<< 28285 1727204262.62527: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204262.62533: stdout chunk (state=3): >>> <<< 28285 1727204262.62604: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.62684: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py <<< 28285 1727204262.62733: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.62744: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.62769: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.62846: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.62910: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.63009: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.63126: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py <<< 28285 1727204262.63161: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py <<< 28285 1727204262.63184: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.63241: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 28285 1727204262.63321: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py <<< 28285 1727204262.63335: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.63620: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.63905: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py <<< 28285 1727204262.63923: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.63988: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.64085: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py <<< 28285 1727204262.64089: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.64238: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available <<< 28285 1727204262.64357: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.64513: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py <<< 28285 1727204262.64516: stdout chunk (state=3): >>>import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py <<< 28285 1727204262.64525: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.64640: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204262.64776: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py <<< 28285 1727204262.64793: stdout chunk (state=3): >>>import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py<<< 28285 1727204262.64799: stdout chunk (state=3): >>> <<< 28285 1727204262.64930: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204262.64937: stdout chunk (state=3): >>> <<< 28285 1727204262.65205: stdout chunk (state=3): >>>import 'gc' # <<< 28285 1727204262.65210: stdout chunk (state=3): >>> <<< 28285 1727204262.66521: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 28285 1727204262.66560: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py <<< 28285 1727204262.66581: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 28285 1727204262.66614: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e045a9220> <<< 28285 1727204262.66632: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e0457bc10> <<< 28285 1727204262.66706: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e0457b220> <<< 28285 1727204262.67490: stdout chunk (state=3): >>> <<< 28285 1727204262.67519: stdout chunk (state=3): >>>{"ansible_facts": {"ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "622812727ab94fd6acd7dd0d437b6e90", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_dsa_public<<< 28285 1727204262.67561: stdout chunk (state=3): >>>": 
"AAAAB3NzaC1kc3MAAACBAL33r0sK53nK1ELEWEygFfBly+jKL3G1irB+e4OjfP+034giVSb4+qmZbsccYzFRUysDiwQ9AOkXxjXzuDmR+xqyvjg1wiGR1mtnhVEjD5QOMP3FrsA4T0YUj+99RePF5V1syZcivhL83fhGMQW2xqX2DsatToaaogZ2OB3PfjtjAAAAFQDxVD/D0jmbOX2y1lkpNUepQHopQwAAAIEAmIlCnXBlcPjUWk7TsM1FtBhlH1jHHCOTF1EkipNNB3yizhCo4XzHdZ42Etc3A12/rcZ94rFaauqCV6XrShBkQ2YBIcz9u8BOyWI/nScoq9IA/qLrhWWtjBWDyrdKnEa5YZssQtDa+FaZQkzy1TQpvFQxv5c95+TrmPFgDpw+0q0AAACBAKYOTFtEPTGzq9w6YdMspRWk65ZgXou58bQl818PvNnuZKKVReDFknfNCcabfj+HjlOg9wBCZZ+D3vopxZ4Qgevz/pLqcnLY7Kxx+xf6NhqDwcEwkHk/VYomBLrfyEZP8N81dcv36ZZUVoca5Y+2ZG2o1gC632nLGosyJBtmPmel", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCzhflzK5hY1zLI6wcdpu95QnCG0TwdK/8SyDudYYHDoRycyI9pVuSUQsXpQq3jHdjCurFgtKtyr8lvo1dWvKJ9SZpQk4asejncDNODyhSR95eNQg6E1G2kN1mscOp76cjW9Muvyhcku112WdRWTEojLJL5DfJAWrsWwHJI+QuhZuKvrlsxPvfOuY5td/aGC/Ydzbjkmya2qvXXJRscQArDnYmiPiatkFESRif9MXdmIn2LqQXAcZGFUG+SWQvZR1PDWKI2U5HxvoeUf+Uh2jDO3mFDWao9+SGRC2QuO+xLJgoiKIx2L3GWLTkbKjAbsk0iedaUuh+GdmUatsU09UVZi9IYBJYjhiYuZKsYx2LNpBqu8fxh5RaBfadQzkDGVBJE45/9X+9vlSygk3zMak9yWtS9vfV+CoODJx9wA1tv3r0Veiy/Y9bbcT7DtQhiWscP2X/cF2QZtdabW+Rb+zKZomn+6upN+zZeyVRClRsqVNURxevMs+UyJTKV481ayMU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHpEZiCiKJZKK5PvXzPGl0kyJcU4P7nxoUjBffLcHt9dAB0RhjGORZ4v3/W6TdO0PAsLaKZ7WyFecLN3V9VWyiA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIK5vZWfq5/76ny3vCPOJqG/mpsIiiNwZzQWhA7bM1PFT", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "42", "epoch": "1727204262", "epoch_int": "1727204262", "date": "2024-09-24", "time": "14:57:42", "iso8601_micro": "2024-09-24T18:57:42.672508Z", "iso8601": "2024-09-24T18:57:42Z", "iso8601_basic": "20240924T145742672508", "iso8601_basic_short": "20240924T145742", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 42862 10.31.9.148 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 42862 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, 
"ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 28285 1727204262.68384: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks<<< 28285 1727204262.68521: stdout chunk (state=3): >>> # clear sys.path_importer_cache <<< 28285 1727204262.68679: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs <<< 28285 1727204262.68894: stdout chunk (state=3): >>># cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site<<< 28285 1727204262.69038: stdout chunk (state=3): >>> # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections<<< 28285 1727204262.69148: stdout chunk (state=3): >>> # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap<<< 28285 1727204262.69250: stdout chunk (state=3): >>> # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery <<< 28285 1727204262.69395: stdout chunk (state=3): >>># cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] 
removing zlib<<< 28285 1727204262.69463: stdout chunk (state=3): >>> # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 <<< 28285 1727204262.69594: stdout chunk (state=3): >>># cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random<<< 28285 1727204262.69712: stdout chunk (state=3): >>> # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ <<< 28285 1727204262.69756: stdout chunk (state=3): >>># destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale<<< 28285 1727204262.69759: stdout chunk (state=3): >>> # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token<<< 28285 1727204262.69793: stdout chunk (state=3): >>> # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string<<< 28285 1727204262.69814: stdout chunk (state=3): >>> # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six<<< 28285 1727204262.69858: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy <<< 28285 1727204262.69862: stdout chunk (state=3): >>># destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy 
ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters<<< 28285 1727204262.69889: stdout chunk (state=3): >>> # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux<<< 28285 1727204262.69926: stdout chunk (state=3): >>> # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic<<< 28285 1727204262.69986: stdout chunk (state=3): >>> # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # 
destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass<<< 28285 1727204262.70023: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly <<< 28285 1727204262.70115: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux<<< 
28285 1727204262.70122: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr<<< 28285 1727204262.70204: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux<<< 28285 1727204262.70210: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy 
ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux<<< 28285 1727204262.70370: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 28285 1727204262.70927: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog <<< 28285 1727204262.70930: stdout chunk (state=3): >>># destroy uuid <<< 28285 1727204262.71001: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy logging # destroy argparse <<< 28285 1727204262.71109: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector<<< 28285 1727204262.71157: stdout chunk (state=3): >>> # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context <<< 28285 1727204262.71171: stdout chunk (state=3): >>># destroy array # destroy _compat_pickle <<< 28285 1727204262.71236: stdout chunk (state=3): >>># destroy queue <<< 28285 1727204262.71266: stdout chunk (state=3): >>># destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction <<< 28285 1727204262.71289: stdout chunk (state=3): >>># destroy shlex <<< 28285 1727204262.71314: stdout chunk (state=3): >>># destroy datetime # destroy base64 <<< 28285 1727204262.71355: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux <<< 28285 1727204262.71378: stdout chunk (state=3): >>># destroy getpass # destroy json <<< 28285 1727204262.71425: stdout chunk (state=3): >>># destroy socket # destroy struct<<< 28285 1727204262.71472: stdout chunk (state=3): >>> # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 28285 1727204262.71551: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep<<< 28285 1727204262.71571: stdout chunk (state=3): >>> # cleanup[3] wiping unicodedata # cleanup[3] wiping gc <<< 28285 1727204262.71610: stdout chunk (state=3): >>># cleanup[3] wiping termios <<< 28285 1727204262.71642: stdout chunk (state=3): >>># cleanup[3] wiping _ssl<<< 28285 1727204262.71730: stdout chunk (state=3): >>> # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc <<< 28285 1727204262.71841: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # 
cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string <<< 28285 1727204262.71920: stdout chunk (state=3): >>># cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform <<< 28285 1727204262.72055: stdout chunk (state=3): >>># destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect <<< 28285 1727204262.72139: stdout chunk (state=3): >>># cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading <<< 28285 1727204262.72245: stdout chunk (state=3): >>># cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external <<< 28285 1727204262.72417: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools <<< 28285 1727204262.72513: stdout chunk (state=3): >>># destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os <<< 28285 1727204262.72699: stdout chunk (state=3): >>># cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8<<< 28285 1727204262.72739: stdout chunk (state=3): >>> # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref<<< 28285 1727204262.72847: stdout chunk (state=3): >>> # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing<<< 28285 1727204262.72873: stdout chunk (state=3): >>> # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128<<< 28285 1727204262.73066: stdout chunk (state=3): >>> # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 28285 1727204262.73137: stdout chunk (state=3): >>># destroy platform # 
destroy _uuid <<< 28285 1727204262.73156: stdout chunk (state=3): >>># destroy _sre # destroy sre_parse # destroy tokenize <<< 28285 1727204262.73203: stdout chunk (state=3): >>># destroy _heapq <<< 28285 1727204262.73229: stdout chunk (state=3): >>># destroy posixpath # destroy stat <<< 28285 1727204262.73303: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess <<< 28285 1727204262.73332: stdout chunk (state=3): >>># destroy selectors <<< 28285 1727204262.73440: stdout chunk (state=3): >>># destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools <<< 28285 1727204262.73468: stdout chunk (state=3): >>># destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator <<< 28285 1727204262.73524: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 28285 1727204262.73603: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 28285 1727204262.74123: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 28285 1727204262.74126: stdout chunk (state=3): >>><<< 28285 1727204262.74129: stderr chunk (state=3): >>><<< 28285 1727204262.74286: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05bf3dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05b983a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05bf3b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05bf3ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05b98490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches 
/usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05b98940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05b98670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05b4f190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05b4f220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05b72850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05b4f940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05bb0880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05b48d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05b72d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05b98970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05aeff10> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05af40a0> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05ae75b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05aee6a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05aef3d0> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e05792eb0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e057929a0> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05792fa0> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05792df0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e057a2160> import '_collections' # import 
'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05ac9e20> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05ac1700> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05ad5760> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05af5eb0> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e057a2d60> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05ac9340> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e05ad5370> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05afba60> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e057a2f40> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e057a2e80> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e057a2df0> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05776460> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches 
/usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05776550> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e057540d0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e057a5b20> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e057a54c0> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e056c42b0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05761d60> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e057a5fa0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05afb0d0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e056d4be0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e056d4f10> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e056e7820> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e056e7d60> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e05680490> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e056d4f40> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # 
extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e05690370> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e056e76a0> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e05690430> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e057a2ac0> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e056ac790> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e056aca60> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e056ac850> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e056ac940> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e056acd90> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e056b62e0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e056ac9d0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e056a0b20> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e057a26a0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 
'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e056acb80> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f5e055da760> # zipimport: found 103 names in '/tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04fec8b0> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e04fec160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04fec280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04fec5e0> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04fec4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04fece20> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e04fec580> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04fec100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code 
object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04f81040> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e04ec93d0> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e04ec90d0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04ec9d30> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04fd4d90> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04fd43a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04fd4f40> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e055daa90> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04faadc0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04faa490> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04fe9a90> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e04faa5b0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04faa5e0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e04f34f70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e055602e0> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e04f317f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05560460> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05560c40> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04f31790> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e05560130> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e05560670> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e05560730> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e055599a0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from 
'/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e04f278e0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e04f44c70> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04f30520> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e04f27e80> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04f30940> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e04f40790> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04f7f850> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04ae5fa0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/_text.py # 
zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04fb2310> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04ff2ca0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e04f64c70> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04ff2bb0> import 
ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04f422b0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04fc0b80> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04940160> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from 
'/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04842100> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04aa6a60> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e04aa69d0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04a7ac70> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04a7ac10> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04aebbb0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04aebc40> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e04a8a310> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04a8a9a0> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04a8a940> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e048a40d0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e05569c40> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04aeb880> import 
ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e0478cf10> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e0478c9d0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e047b6c10> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04806c40> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # 
zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e048085e0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e04808790> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip 
/tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # 
zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_b74bnrok/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available import 'gc' # # 
/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e045a9220> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e0457bc10> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e0457b220> {"ansible_facts": {"ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "622812727ab94fd6acd7dd0d437b6e90", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAL33r0sK53nK1ELEWEygFfBly+jKL3G1irB+e4OjfP+034giVSb4+qmZbsccYzFRUysDiwQ9AOkXxjXzuDmR+xqyvjg1wiGR1mtnhVEjD5QOMP3FrsA4T0YUj+99RePF5V1syZcivhL83fhGMQW2xqX2DsatToaaogZ2OB3PfjtjAAAAFQDxVD/D0jmbOX2y1lkpNUepQHopQwAAAIEAmIlCnXBlcPjUWk7TsM1FtBhlH1jHHCOTF1EkipNNB3yizhCo4XzHdZ42Etc3A12/rcZ94rFaauqCV6XrShBkQ2YBIcz9u8BOyWI/nScoq9IA/qLrhWWtjBWDyrdKnEa5YZssQtDa+FaZQkzy1TQpvFQxv5c95+TrmPFgDpw+0q0AAACBAKYOTFtEPTGzq9w6YdMspRWk65ZgXou58bQl818PvNnuZKKVReDFknfNCcabfj+HjlOg9wBCZZ+D3vopxZ4Qgevz/pLqcnLY7Kxx+xf6NhqDwcEwkHk/VYomBLrfyEZP8N81dcv36ZZUVoca5Y+2ZG2o1gC632nLGosyJBtmPmel", 
"ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCzhflzK5hY1zLI6wcdpu95QnCG0TwdK/8SyDudYYHDoRycyI9pVuSUQsXpQq3jHdjCurFgtKtyr8lvo1dWvKJ9SZpQk4asejncDNODyhSR95eNQg6E1G2kN1mscOp76cjW9Muvyhcku112WdRWTEojLJL5DfJAWrsWwHJI+QuhZuKvrlsxPvfOuY5td/aGC/Ydzbjkmya2qvXXJRscQArDnYmiPiatkFESRif9MXdmIn2LqQXAcZGFUG+SWQvZR1PDWKI2U5HxvoeUf+Uh2jDO3mFDWao9+SGRC2QuO+xLJgoiKIx2L3GWLTkbKjAbsk0iedaUuh+GdmUatsU09UVZi9IYBJYjhiYuZKsYx2LNpBqu8fxh5RaBfadQzkDGVBJE45/9X+9vlSygk3zMak9yWtS9vfV+CoODJx9wA1tv3r0Veiy/Y9bbcT7DtQhiWscP2X/cF2QZtdabW+Rb+zKZomn+6upN+zZeyVRClRsqVNURxevMs+UyJTKV481ayMU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHpEZiCiKJZKK5PvXzPGl0kyJcU4P7nxoUjBffLcHt9dAB0RhjGORZ4v3/W6TdO0PAsLaKZ7WyFecLN3V9VWyiA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIK5vZWfq5/76ny3vCPOJqG/mpsIiiNwZzQWhA7bM1PFT", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "42", "epoch": "1727204262", "epoch_int": "1727204262", "date": "2024-09-24", "time": "14:57:42", "iso8601_micro": "2024-09-24T18:57:42.672508Z", "iso8601": "2024-09-24T18:57:42Z", "iso8601_basic": "20240924T145742672508", "iso8601_basic_short": "20240924T145742", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 42862 10.31.9.148 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 42862 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] 
removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] 
removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] 
removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing 
ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy 
ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # 
cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy 
grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
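The "import ..." and "# cleanup[...]" lines above are CPython's verbose-mode trace: the managed node ran the setup module with PYTHONVERBOSE=1 in its environment (visible under ansible_env in the facts payload), so the interpreter narrates every import and teardown step around the module's JSON result. The controller still locates and decodes the JSON, but the interpreter output trailing it is what triggers the "[WARNING]: Module invocation had junk after the JSON data" message that follows. The sketch below is illustrative only and is not Ansible's internal parser; extract_json_result is a hypothetical helper showing how a JSON document can be recovered from output wrapped in this kind of noise.

import json

def extract_json_result(raw_output):
    # Recover a JSON document from output that is wrapped in interpreter
    # noise (e.g. PYTHONVERBOSE import/cleanup messages). Illustrative
    # sketch only -- not Ansible's own implementation.
    lines = raw_output.splitlines()
    for idx, line in enumerate(lines):
        if line.lstrip().startswith(("{", "[")):
            candidate = "\n".join(lines[idx:]).lstrip()
            result, end = json.JSONDecoder().raw_decode(candidate)
            junk = candidate[end:].strip()
            return result, junk
    raise ValueError("no JSON document found in module output")

# Example with noise before and after the JSON, as in the log above.
noisy = "import 'gc' #\n{\"ansible_facts\": {\"ansible_system\": \"Linux\"}}\n# clear builtins._"
facts, junk = extract_json_result(noisy)
print(facts["ansible_facts"]["ansible_system"])  # -> Linux
print(bool(junk))  # -> True; leftover text like this is what the warning reports

Unsetting PYTHONVERBOSE for the remote interpreter should remove both the noise and the warning; the gathered facts themselves are unaffected either way.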
[WARNING]: Module invocation had junk after the JSON data: # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy 
zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # 
cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing 
ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy 
ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex # 
destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy 
systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks 28285 1727204262.75465: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204262.011042-28500-165936024065541/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28285 1727204262.75468: _low_level_execute_command(): starting 28285 1727204262.75471: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204262.011042-28500-165936024065541/ > /dev/null 2>&1 && sleep 0' 28285 1727204262.76015: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28285 1727204262.76032: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28285 1727204262.76051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28285 1727204262.76074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28285 1727204262.76118: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 28285 1727204262.76131: stderr chunk (state=3): >>>debug2: match not found <<< 28285 1727204262.76146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28285 1727204262.76171: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28285 1727204262.76185: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 28285 1727204262.76196: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28285 1727204262.76209: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28285 1727204262.76223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28285 1727204262.76239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28285 1727204262.76255: stderr chunk (state=3): >>>debug2: checking match for 'final all' 
host 10.31.9.148 originally 10.31.9.148 <<< 28285 1727204262.76269: stderr chunk (state=3): >>>debug2: match found <<< 28285 1727204262.76284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28285 1727204262.76381: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28285 1727204262.76413: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28285 1727204262.76432: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28285 1727204262.76532: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 28285 1727204262.79054: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28285 1727204262.79103: stderr chunk (state=3): >>><<< 28285 1727204262.79107: stdout chunk (state=3): >>><<< 28285 1727204262.79120: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 28285 1727204262.79125: handler run complete 28285 1727204262.79165: variable 'ansible_facts' from source: unknown 28285 1727204262.79202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204262.79280: variable 'ansible_facts' from source: unknown 28285 1727204262.79310: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204262.79343: attempt loop complete, returning result 28285 1727204262.79346: _execute() done 28285 1727204262.79353: dumping result to json 28285 1727204262.79362: done dumping result, returning 28285 1727204262.79371: done running TaskExecutor() for managed-node1/TASK: Gather the minimum subset of ansible_facts required by the network role test [0affcd87-79f5-57a1-d976-0000000001bb] 28285 1727204262.79375: sending task result for task 0affcd87-79f5-57a1-d976-0000000001bb 28285 1727204262.79514: done sending task result for task 0affcd87-79f5-57a1-d976-0000000001bb 28285 1727204262.79516: WORKER PROCESS EXITING ok: [managed-node1] 28285 1727204262.79618: no more pending results, returning what we have 28285 1727204262.79621: results queue empty 28285 1727204262.79622: checking for any_errors_fatal 28285 1727204262.79623: done checking for any_errors_fatal 28285 1727204262.79624: checking for max_fail_percentage 28285 1727204262.79625: done checking for max_fail_percentage 28285 1727204262.79627: 
checking to see if all hosts have failed and the running result is not ok 28285 1727204262.79628: done checking to see if all hosts have failed 28285 1727204262.79628: getting the remaining hosts for this loop 28285 1727204262.79630: done getting the remaining hosts for this loop 28285 1727204262.79633: getting the next task for host managed-node1 28285 1727204262.79641: done getting next task for host managed-node1 28285 1727204262.79643: ^ task is: TASK: Check if system is ostree 28285 1727204262.79646: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204262.79651: getting variables 28285 1727204262.79652: in VariableManager get_vars() 28285 1727204262.79687: Calling all_inventory to load vars for managed-node1 28285 1727204262.79690: Calling groups_inventory to load vars for managed-node1 28285 1727204262.79693: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204262.79702: Calling all_plugins_play to load vars for managed-node1 28285 1727204262.79707: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204262.79709: Calling groups_plugins_play to load vars for managed-node1 28285 1727204262.79817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204262.79955: done with get_vars() 28285 1727204262.79962: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Tuesday 24 September 2024 14:57:42 -0400 (0:00:00.866) 0:00:03.610 ***** 28285 1727204262.80039: entering _queue_task() for managed-node1/stat 28285 1727204262.80251: worker is 1 (out of 1 available) 28285 1727204262.80266: exiting _queue_task() for managed-node1/stat 28285 1727204262.80278: done queuing things up, now waiting for results queue to drain 28285 1727204262.80279: waiting for pending results... 
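
The lines that follow trace the standard round trip for this queued stat task: resolve connection variables, create a remote temp directory under /root/.ansible/tmp, sftp the generated AnsiballZ_stat.py payload into it, chmod it, and execute it with /usr/bin/python3.9 (again under PYTHONVERBOSE=1, which is why more import noise appears). Functionally, the "Check if system is ostree" step amounts to stat-ing an rpm-ostree marker file on the managed node and letting the play derive the __network_is_ostree fact from the result. A rough sketch is below; the marker path /run/ostree-booted and the helper name are assumptions for illustration, not taken from this log (the log only shows the fact name in the conditional above):

    import json
    import os

    OSTREE_MARKER = "/run/ostree-booted"  # assumed marker path, not shown in this log

    def check_ostree(path: str = OSTREE_MARKER) -> dict:
        """Minimal stat-like result a play could use to set __network_is_ostree."""
        return {
            "changed": False,
            "stat": {"path": path, "exists": os.path.exists(path)},
        }

    if __name__ == "__main__":
        # A real module prints a single JSON document to stdout for the controller.
        print(json.dumps(check_ostree()))

That single JSON document on stdout is what the controller parses back once it logs "done with _execute_module"; everything the verbose interpreter prints around it becomes the "junk" flagged earlier.
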
28285 1727204262.80414: running TaskExecutor() for managed-node1/TASK: Check if system is ostree 28285 1727204262.80529: in run() - task 0affcd87-79f5-57a1-d976-0000000001bd 28285 1727204262.80561: variable 'ansible_search_path' from source: unknown 28285 1727204262.80574: variable 'ansible_search_path' from source: unknown 28285 1727204262.80658: calling self._execute() 28285 1727204262.80738: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204262.80757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204262.80783: variable 'omit' from source: magic vars 28285 1727204262.81351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28285 1727204262.81705: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28285 1727204262.81746: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28285 1727204262.81773: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28285 1727204262.81800: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28285 1727204262.81867: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28285 1727204262.81887: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28285 1727204262.81906: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204262.81923: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28285 1727204262.82019: Evaluated conditional (not __network_is_ostree is defined): True 28285 1727204262.82023: variable 'omit' from source: magic vars 28285 1727204262.82056: variable 'omit' from source: magic vars 28285 1727204262.82083: variable 'omit' from source: magic vars 28285 1727204262.82101: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28285 1727204262.82123: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28285 1727204262.82137: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28285 1727204262.82153: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28285 1727204262.82160: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28285 1727204262.82186: variable 'inventory_hostname' from source: host vars for 'managed-node1' 28285 1727204262.82189: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204262.82192: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204262.82258: Set connection var ansible_shell_executable to /bin/sh 28285 1727204262.82263: Set connection var ansible_pipelining to False 28285 1727204262.82274: Set 
connection var ansible_timeout to 10 28285 1727204262.82276: Set connection var ansible_shell_type to sh 28285 1727204262.82280: Set connection var ansible_connection to ssh 28285 1727204262.82288: Set connection var ansible_module_compression to ZIP_DEFLATED 28285 1727204262.82303: variable 'ansible_shell_executable' from source: unknown 28285 1727204262.82307: variable 'ansible_connection' from source: unknown 28285 1727204262.82309: variable 'ansible_module_compression' from source: unknown 28285 1727204262.82311: variable 'ansible_shell_type' from source: unknown 28285 1727204262.82313: variable 'ansible_shell_executable' from source: unknown 28285 1727204262.82316: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204262.82319: variable 'ansible_pipelining' from source: unknown 28285 1727204262.82323: variable 'ansible_timeout' from source: unknown 28285 1727204262.82325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204262.82428: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 28285 1727204262.82439: variable 'omit' from source: magic vars 28285 1727204262.82442: starting attempt loop 28285 1727204262.82444: running the handler 28285 1727204262.82454: _low_level_execute_command(): starting 28285 1727204262.82461: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28285 1727204262.82974: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28285 1727204262.82979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28285 1727204262.83006: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28285 1727204262.83010: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28285 1727204262.83012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28285 1727204262.83063: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28285 1727204262.83070: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28285 1727204262.83146: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 28285 1727204262.85287: stdout chunk (state=3): >>>/root <<< 28285 1727204262.85457: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28285 1727204262.85495: stderr chunk (state=3): >>><<< 28285 1727204262.85497: stdout chunk (state=3): >>><<< 28285 1727204262.85574: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 28285 1727204262.85586: _low_level_execute_command(): starting 28285 1727204262.85589: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204262.855118-28541-67791356262793 `" && echo ansible-tmp-1727204262.855118-28541-67791356262793="` echo /root/.ansible/tmp/ansible-tmp-1727204262.855118-28541-67791356262793 `" ) && sleep 0' 28285 1727204262.85986: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 28285 1727204262.85989: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28285 1727204262.86015: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28285 1727204262.86034: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28285 1727204262.86037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28285 1727204262.86091: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28285 1727204262.86094: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28285 1727204262.86166: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 28285 1727204262.88806: stdout chunk (state=3): >>>ansible-tmp-1727204262.855118-28541-67791356262793=/root/.ansible/tmp/ansible-tmp-1727204262.855118-28541-67791356262793 <<< 28285 1727204262.88985: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28285 1727204262.89033: stderr chunk (state=3): >>><<< 28285 1727204262.89038: stdout chunk (state=3): >>><<< 28285 1727204262.89058: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204262.855118-28541-67791356262793=/root/.ansible/tmp/ansible-tmp-1727204262.855118-28541-67791356262793 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 28285 1727204262.89101: variable 'ansible_module_compression' from source: unknown 28285 1727204262.89144: ANSIBALLZ: Using lock for stat 28285 1727204262.89152: ANSIBALLZ: Acquiring lock 28285 1727204262.89154: ANSIBALLZ: Lock acquired: 140647065762176 28285 1727204262.89157: ANSIBALLZ: Creating module 28285 1727204262.97867: ANSIBALLZ: Writing module into payload 28285 1727204262.97945: ANSIBALLZ: Writing module 28285 1727204262.97962: ANSIBALLZ: Renaming module 28285 1727204262.97973: ANSIBALLZ: Done creating module 28285 1727204262.97992: variable 'ansible_facts' from source: unknown 28285 1727204262.98044: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204262.855118-28541-67791356262793/AnsiballZ_stat.py 28285 1727204262.98166: Sending initial data 28285 1727204262.98179: Sent initial data (151 bytes) 28285 1727204262.98900: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28285 1727204262.98904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28285 1727204262.98940: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28285 1727204262.98943: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28285 1727204262.98946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28285 1727204262.98994: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28285 1727204262.99006: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28285 1727204262.99087: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 28285 1727204263.01532: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28285 1727204263.01585: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 28285 1727204263.01647: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28285ojofnqq2/tmp0eywj02t /root/.ansible/tmp/ansible-tmp-1727204262.855118-28541-67791356262793/AnsiballZ_stat.py <<< 28285 1727204263.01702: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 28285 1727204263.02572: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28285 1727204263.02688: stderr chunk (state=3): >>><<< 28285 1727204263.02692: stdout chunk (state=3): >>><<< 28285 1727204263.02709: done transferring module to remote 28285 1727204263.02721: _low_level_execute_command(): starting 28285 1727204263.02726: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204262.855118-28541-67791356262793/ /root/.ansible/tmp/ansible-tmp-1727204262.855118-28541-67791356262793/AnsiballZ_stat.py && sleep 0' 28285 1727204263.03225: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28285 1727204263.03230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28285 1727204263.03269: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28285 1727204263.03272: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28285 1727204263.03276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28285 1727204263.03332: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28285 1727204263.03336: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28285 1727204263.03338: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28285 1727204263.03405: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 28285 1727204263.05864: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28285 1727204263.05925: stderr chunk (state=3): >>><<< 28285 
1727204263.05929: stdout chunk (state=3): >>><<< 28285 1727204263.05943: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 28285 1727204263.05947: _low_level_execute_command(): starting 28285 1727204263.05952: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204262.855118-28541-67791356262793/AnsiballZ_stat.py && sleep 0' 28285 1727204263.06479: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28285 1727204263.06483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28285 1727204263.06518: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 28285 1727204263.06537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28285 1727204263.06540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28285 1727204263.06579: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28285 1727204263.06591: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28285 1727204263.06669: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 28285 1727204263.09650: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 28285 1727204263.09655: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 28285 1727204263.09658: stdout chunk (state=3): >>> <<< 28285 1727204263.09742: stdout chunk (state=3): >>>import '_io' # <<< 28285 1727204263.09746: stdout chunk (state=3): >>>import 'marshal' # <<< 28285 1727204263.09800: stdout chunk (state=3): >>>import 
'posix' # <<< 28285 1727204263.09841: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 28285 1727204263.09907: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 28285 1727204263.09985: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 28285 1727204263.10019: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 28285 1727204263.10037: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' <<< 28285 1727204263.10058: stdout chunk (state=3): >>>import '_codecs' # <<< 28285 1727204263.10089: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560cd8dc0> <<< 28285 1727204263.10134: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 28285 1727204263.10153: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' <<< 28285 1727204263.10156: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560c7d3a0> <<< 28285 1727204263.10175: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560cd8b20> <<< 28285 1727204263.10203: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py <<< 28285 1727204263.10213: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 28285 1727204263.10228: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560cd8ac0> <<< 28285 1727204263.10257: stdout chunk (state=3): >>>import '_signal' # <<< 28285 1727204263.10284: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py <<< 28285 1727204263.10289: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 28285 1727204263.10305: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560c7d490> <<< 28285 1727204263.10344: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 28285 1727204263.10370: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 28285 1727204263.10392: stdout chunk (state=3): >>>import '_abc' # <<< 28285 1727204263.10407: stdout chunk (state=3): >>>import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560c7d940> <<< 28285 1727204263.10437: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560c7d670> <<< 28285 1727204263.10474: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches 
/usr/lib64/python3.9/site.py <<< 28285 1727204263.10486: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 28285 1727204263.10519: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 28285 1727204263.10554: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 28285 1727204263.10581: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 28285 1727204263.10592: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 28285 1727204263.10633: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560c34190> <<< 28285 1727204263.10678: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 28285 1727204263.10690: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 28285 1727204263.11146: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560c34220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560c57850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560c34940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560c95880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560c2dd90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560c57d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560c7d970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 28285 1727204263.11454: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 28285 1727204263.11482: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 28285 1727204263.11516: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 28285 1727204263.11542: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 28285 1727204263.11562: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 28285 1727204263.11595: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 28285 1727204263.11622: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 28285 1727204263.11633: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75609adf10> <<< 28285 1727204263.11687: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75609b40a0> <<< 28285 1727204263.11727: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 28285 1727204263.11751: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 28285 1727204263.11772: stdout chunk (state=3): >>>import '_sre' # <<< 28285 1727204263.11793: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 28285 1727204263.11821: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 28285 1727204263.11841: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 28285 1727204263.11887: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75609a75b0> <<< 28285 1727204263.11900: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75609ae6a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75609ad3d0> <<< 28285 1727204263.11931: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 28285 1727204263.12007: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 28285 1727204263.12042: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 28285 1727204263.12078: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 28285 1727204263.12106: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 28285 1727204263.12167: stdout chunk (state=3): >>># extension module 
'_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7560931eb0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75609319a0> <<< 28285 1727204263.12198: stdout chunk (state=3): >>>import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560931fa0> <<< 28285 1727204263.12240: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 28285 1727204263.12273: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 28285 1727204263.12289: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560931df0> <<< 28285 1727204263.12328: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560941160> <<< 28285 1727204263.12340: stdout chunk (state=3): >>>import '_collections' # <<< 28285 1727204263.12408: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560989e20> import '_functools' # <<< 28285 1727204263.12445: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560981700> <<< 28285 1727204263.12533: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560995760> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75609b5eb0> <<< 28285 1727204263.12571: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 28285 1727204263.12606: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7560941d60> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560989340> <<< 28285 1727204263.12672: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7560995370> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75609bba60> <<< 28285 1727204263.12708: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 28285 1727204263.12739: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 28285 1727204263.12790: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 28285 1727204263.12807: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560941f40> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560941e80> <<< 28285 1727204263.12839: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560941df0> <<< 28285 1727204263.12876: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py <<< 28285 1727204263.12909: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py <<< 28285 1727204263.12929: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 28285 1727204263.12940: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 28285 1727204263.13014: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 28285 1727204263.13050: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560915460> <<< 28285 1727204263.13081: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 28285 1727204263.13097: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 28285 1727204263.13140: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560915550> <<< 28285 1727204263.13335: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75608f30d0> <<< 28285 1727204263.13392: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560944b20> <<< 28285 1727204263.13424: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75609444c0> <<< 28285 1727204263.13442: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from 
'/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 28285 1727204263.13487: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 28285 1727204263.13515: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 28285 1727204263.13545: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75608492b0> <<< 28285 1727204263.13581: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560900d60> <<< 28285 1727204263.13656: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560944fa0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75609bb0d0> <<< 28285 1727204263.13687: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 28285 1727204263.13705: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 28285 1727204263.13741: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' <<< 28285 1727204263.13762: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560859be0> <<< 28285 1727204263.13797: stdout chunk (state=3): >>>import 'errno' # <<< 28285 1727204263.13825: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7560859f10> <<< 28285 1727204263.13838: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 28285 1727204263.13905: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' <<< 28285 1727204263.13908: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f756086c820> <<< 28285 1727204263.13920: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 28285 1727204263.13946: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 28285 1727204263.13990: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f756086cd60> <<< 28285 1727204263.14040: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f75607fa490> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560859f40> <<< 28285 1727204263.14076: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 28285 1727204263.14144: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f756080a370> <<< 28285 1727204263.14193: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f756086c6a0> import 'pwd' # <<< 28285 1727204263.14196: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f756080a430> <<< 28285 1727204263.14247: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560941ac0> <<< 28285 1727204263.14284: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 28285 1727204263.14317: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 28285 1727204263.14361: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 28285 1727204263.14402: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 28285 1727204263.14497: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' <<< 28285 1727204263.14514: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' <<< 28285 1727204263.14547: stdout chunk (state=3): >>>import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7560826790> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 28285 1727204263.14606: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' <<< 28285 1727204263.14634: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7560826a60> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560826850> <<< 28285 1727204263.14699: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' <<< 28285 1727204263.14721: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7560826940> <<< 28285 
1727204263.14777: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 28285 1727204263.15083: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' <<< 28285 1727204263.15094: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7560826d90> <<< 28285 1727204263.15150: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' <<< 28285 1727204263.15205: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f75608302e0> <<< 28285 1727204263.15209: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75608269d0> <<< 28285 1727204263.15221: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f756081ab20> <<< 28285 1727204263.15261: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75609416a0> <<< 28285 1727204263.15315: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 28285 1727204263.15411: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 28285 1727204263.15463: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560826b80> <<< 28285 1727204263.15613: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 28285 1727204263.15684: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f7560741760> <<< 28285 1727204263.15972: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip' <<< 28285 1727204263.16002: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204263.16175: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204263.16186: stdout chunk (state=3): >>> <<< 28285 1727204263.16233: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/__init__.py <<< 28285 1727204263.16266: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204263.16294: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204263.16326: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/__init__.py <<< 28285 1727204263.16375: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204263.18358: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204263.19993: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py <<< 28285 1727204263.20018: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' <<< 28285 1727204263.20036: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f756014d8b0> <<< 28285 1727204263.20071: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py <<< 28285 1727204263.20074: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 28285 1727204263.20154: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py <<< 28285 1727204263.20157: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 28285 1727204263.20204: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py <<< 28285 1727204263.20207: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 28285 1727204263.20277: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' <<< 28285 1727204263.20289: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f756014d160> <<< 28285 1727204263.20366: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f756014d280> <<< 28285 1727204263.20416: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f756014d5e0> <<< 28285 1727204263.20474: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py <<< 28285 1727204263.20487: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 28285 1727204263.20542: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f756014d4f0> <<< 28285 1727204263.20591: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f756014de20> <<< 28285 1727204263.20605: stdout chunk (state=3): >>>import 'atexit' # <<< 28285 1727204263.20653: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' <<< 28285 1727204263.20680: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f756014d580> <<< 28285 1727204263.20722: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 28285 1727204263.20775: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 28285 1727204263.20845: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f756014d100> <<< 28285 1727204263.20879: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 28285 1727204263.20914: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 28285 1727204263.20963: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 28285 1727204263.21004: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 28285 1727204263.21058: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py <<< 28285 1727204263.21079: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 28285 1727204263.21681: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75600a4fd0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f75600c2c40> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f75600c2f40> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75600c22e0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75601b5d90> <<< 28285 1727204263.21826: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75601b53a0> <<< 28285 1727204263.21874: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py <<< 28285 1727204263.21906: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 28285 1727204263.21936: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75601b5f40> <<< 28285 1727204263.21977: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py <<< 28285 1727204263.22015: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 28285 1727204263.22057: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py <<< 28285 1727204263.22072: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 28285 1727204263.22106: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 28285 1727204263.22138: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 28285 1727204263.22188: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py <<< 28285 1727204263.22223: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' <<< 28285 1727204263.22230: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560741a90> <<< 28285 1727204263.22361: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560120dc0> <<< 28285 1727204263.22391: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560120490> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560157580> <<< 28285 1727204263.22465: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' <<< 28285 1727204263.22482: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f75601205b0> <<< 28285 1727204263.22526: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py <<< 28285 1727204263.22561: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75601205e0> <<< 28285 1727204263.22595: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 28285 1727204263.22625: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 28285 1727204263.22674: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 28285 1727204263.22717: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 28285 1727204263.22828: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' <<< 28285 1727204263.22853: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7560095f70> <<< 28285 1727204263.22879: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75601962e0> <<< 28285 1727204263.22916: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 28285 1727204263.22953: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 28285 1727204263.23044: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' <<< 28285 1727204263.23089: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f75600927f0> <<< 28285 1727204263.23121: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560196460> <<< 28285 
1727204263.23136: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 28285 1727204263.23200: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 28285 1727204263.23238: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py <<< 28285 1727204263.23283: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' <<< 28285 1727204263.23299: stdout chunk (state=3): >>>import '_string' # <<< 28285 1727204263.23402: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75601adf40> <<< 28285 1727204263.23668: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560092790> <<< 28285 1727204263.23836: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' <<< 28285 1727204263.23880: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' <<< 28285 1727204263.23883: stdout chunk (state=3): >>>import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f75600925e0> <<< 28285 1727204263.23959: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' <<< 28285 1727204263.23962: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' <<< 28285 1727204263.23981: stdout chunk (state=3): >>>import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7560091550> <<< 28285 1727204263.24052: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' <<< 28285 1727204263.24111: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' <<< 28285 1727204263.24115: stdout chunk (state=3): >>>import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7560091490> <<< 28285 1727204263.24127: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f756018e9a0> <<< 28285 1727204263.24192: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py <<< 28285 1727204263.24197: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 28285 1727204263.24233: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 28285 1727204263.24270: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 28285 1727204263.24382: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' <<< 28285 1727204263.24385: stdout chunk (state=3): >>># 
extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f75601166a0> <<< 28285 1727204263.24714: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' <<< 28285 1727204263.24753: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' <<< 28285 1727204263.24772: stdout chunk (state=3): >>>import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7560115bb0> <<< 28285 1727204263.24793: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75601260d0> <<< 28285 1727204263.24837: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' <<< 28285 1727204263.24878: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' <<< 28285 1727204263.24913: stdout chunk (state=3): >>>import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7560116100> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560159c40> <<< 28285 1727204263.24967: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204263.24971: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204263.24983: stdout chunk (state=3): >>>import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py <<< 28285 1727204263.25017: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204263.25132: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204263.25281: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204263.25310: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py <<< 28285 1727204263.25333: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204263.25392: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204263.25408: stdout chunk (state=3): >>>import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py <<< 28285 1727204263.25411: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204263.25560: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204263.25733: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204263.26546: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204263.26554: stdout chunk (state=3): >>> <<< 28285 1727204263.27336: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py <<< 28285 1727204263.27395: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 28285 1727204263.27399: stdout chunk (state=3): >>>import 
ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py <<< 28285 1727204263.27432: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 28285 1727204263.27473: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 28285 1727204263.27568: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' <<< 28285 1727204263.27589: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f756005d940> <<< 28285 1727204263.27724: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py <<< 28285 1727204263.27755: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 28285 1727204263.27758: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560113d30> <<< 28285 1727204263.27773: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f756010a7c0> <<< 28285 1727204263.27834: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py <<< 28285 1727204263.27868: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204263.27891: stdout chunk (state=3): >>> <<< 28285 1727204263.27927: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204263.27967: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available<<< 28285 1727204263.27970: stdout chunk (state=3): >>> <<< 28285 1727204263.28173: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204263.28399: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py <<< 28285 1727204263.28402: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 28285 1727204263.28446: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75601154c0> <<< 28285 1727204263.28462: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204263.29094: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204263.29721: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204263.29821: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204263.29824: stdout chunk (state=3): >>> <<< 28285 1727204263.29932: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/common/collections.py<<< 28285 1727204263.29938: stdout chunk (state=3): >>> <<< 28285 1727204263.29956: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204263.29998: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 28285 1727204263.30086: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py<<< 28285 1727204263.30103: stdout chunk (state=3): >>> <<< 28285 1727204263.30106: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204263.30195: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204263.30307: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/errors.py<<< 28285 1727204263.30310: stdout chunk (state=3): >>> <<< 28285 1727204263.30335: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204263.30339: stdout chunk (state=3): >>> <<< 28285 1727204263.30368: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204263.30372: stdout chunk (state=3): >>> <<< 28285 1727204263.30398: stdout chunk (state=3): >>>import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py<<< 28285 1727204263.30401: stdout chunk (state=3): >>> <<< 28285 1727204263.30908: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available <<< 28285 1727204263.31224: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 28285 1727204263.31285: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc'<<< 28285 1727204263.31295: stdout chunk (state=3): >>> <<< 28285 1727204263.31306: stdout chunk (state=3): >>>import '_ast' # <<< 28285 1727204263.31445: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f755fbea940><<< 28285 1727204263.31448: stdout chunk (state=3): >>> <<< 28285 1727204263.31460: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204263.31562: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204263.31715: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py <<< 28285 1727204263.31719: stdout chunk (state=3): >>>import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py <<< 28285 1727204263.31731: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py <<< 28285 1727204263.31768: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204263.31827: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204263.31914: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip 
/tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/common/locale.py <<< 28285 1727204263.31928: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204263.31991: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204263.32063: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204263.32068: stdout chunk (state=3): >>> <<< 28285 1727204263.32210: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204263.32213: stdout chunk (state=3): >>> <<< 28285 1727204263.32322: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py<<< 28285 1727204263.32329: stdout chunk (state=3): >>> <<< 28285 1727204263.32377: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 28285 1727204263.32520: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' <<< 28285 1727204263.32524: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f75601a0b50> <<< 28285 1727204263.32582: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f755fbe9070> <<< 28285 1727204263.32658: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/common/file.py <<< 28285 1727204263.32662: stdout chunk (state=3): >>>import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/common/process.py <<< 28285 1727204263.32677: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204263.32843: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204263.32937: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204263.32940: stdout chunk (state=3): >>> <<< 28285 1727204263.32982: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204263.33049: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 28285 1727204263.33082: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 28285 1727204263.33134: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py<<< 28285 1727204263.33137: stdout chunk (state=3): >>> <<< 28285 1727204263.33188: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 28285 1727204263.33232: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py<<< 28285 1727204263.33235: stdout chunk (state=3): >>> <<< 28285 1727204263.33277: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc'<<< 28285 1727204263.33280: stdout chunk (state=3): >>> <<< 28285 1727204263.33427: stdout chunk (state=3): >>>import 'gettext' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f755fc3a6d0> <<< 28285 1727204263.33503: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560054c10> <<< 28285 1727204263.33643: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75600535b0> # destroy ansible.module_utils.distro<<< 28285 1727204263.33648: stdout chunk (state=3): >>> import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py <<< 28285 1727204263.33661: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204263.33701: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204263.33751: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py<<< 28285 1727204263.33766: stdout chunk (state=3): >>> import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py <<< 28285 1727204263.33869: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/basic.py <<< 28285 1727204263.33912: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204263.33948: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204263.33951: stdout chunk (state=3): >>>import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/modules/__init__.py <<< 28285 1727204263.33984: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204263.33987: stdout chunk (state=3): >>> <<< 28285 1727204263.34188: stdout chunk (state=3): >>># zipimport: zlib available<<< 28285 1727204263.34191: stdout chunk (state=3): >>> <<< 28285 1727204263.34459: stdout chunk (state=3): >>># zipimport: zlib available <<< 28285 1727204263.34682: stdout chunk (state=3): >>> <<< 28285 1727204263.34685: stdout chunk (state=3): >>>{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 28285 1727204263.34714: stdout chunk (state=3): >>># destroy __main__ <<< 28285 1727204263.35122: stdout chunk (state=3): >>># clear builtins._ <<< 28285 1727204263.35230: stdout chunk (state=3): >>># clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks <<< 28285 1727204263.35278: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 28285 1727204263.35346: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins <<< 28285 1727204263.35453: stdout chunk (state=3): >>># cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal <<< 28285 1727204263.35571: stdout chunk (state=3): >>># cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # 
cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases <<< 28285 1727204263.35712: stdout chunk (state=3): >>># cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat <<< 28285 1727204263.35886: stdout chunk (state=3): >>># cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale<<< 28285 1727204263.35984: stdout chunk (state=3): >>> # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile <<< 28285 1727204263.36065: stdout chunk (state=3): >>># cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator<<< 28285 1727204263.36103: stdout chunk (state=3): >>> # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections <<< 28285 1727204263.36131: stdout chunk (state=3): >>># cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg<<< 28285 1727204263.36157: stdout chunk (state=3): >>> # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib<<< 28285 1727204263.36190: stdout chunk (state=3): >>> # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2<<< 28285 1727204263.36220: stdout chunk (state=3): >>> # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible<<< 28285 1727204263.36245: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ 
# destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex <<< 28285 1727204263.36274: stdout chunk (state=3): >>># cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string<<< 28285 1727204263.36297: stdout chunk (state=3): >>> # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common<<< 28285 1727204263.36320: stdout chunk (state=3): >>> # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux<<< 28285 1727204263.36346: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters<<< 28285 1727204263.36378: stdout chunk (state=3): >>> # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy 
ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process <<< 28285 1727204263.36669: stdout chunk (state=3): >>># destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins <<< 28285 1727204263.36681: stdout chunk (state=3): >>># destroy importlib.util <<< 28285 1727204263.36735: stdout chunk (state=3): >>># destroy importlib.abc # destroy importlib.machinery <<< 28285 1727204263.36847: stdout chunk (state=3): >>># destroy zipimport # destroy _compression # destroy binascii # destroy importlib<<< 28285 1727204263.36862: stdout chunk (state=3): >>> # destroy struct # destroy bz2 # destroy lzma <<< 28285 1727204263.36894: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy tempfile<<< 28285 1727204263.36986: stdout chunk (state=3): >>> # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux <<< 28285 1727204263.36989: stdout chunk (state=3): >>># destroy hashlib # destroy json.decoder # destroy json.encoder <<< 28285 1727204263.37028: stdout chunk (state=3): >>># destroy json.scanner # destroy _json <<< 28285 1727204263.37056: stdout chunk (state=3): >>># destroy encodings <<< 28285 1727204263.37096: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 28285 1727204263.37129: stdout chunk (state=3): >>># destroy array<<< 28285 1727204263.37141: stdout chunk (state=3): >>> # destroy datetime <<< 28285 1727204263.37196: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy json <<< 28285 1727204263.37234: stdout chunk (state=3): >>># destroy shlex # destroy logging # destroy argparse<<< 28285 1727204263.37237: stdout chunk (state=3): >>> <<< 28285 1727204263.37347: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes<<< 28285 1727204263.37417: stdout chunk (state=3): >>> # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader<<< 28285 1727204263.37439: stdout chunk (state=3): >>> # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform<<< 28285 1727204263.37479: stdout chunk (state=3): >>> # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select<<< 28285 1727204263.37522: stdout chunk (state=3): >>> # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping 
_blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil<<< 28285 1727204263.37549: stdout chunk (state=3): >>> # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref<<< 28285 1727204263.37579: stdout chunk (state=3): >>> # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum<<< 28285 1727204263.37598: stdout chunk (state=3): >>> # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator<<< 28285 1727204263.37637: stdout chunk (state=3): >>> # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre<<< 28285 1727204263.37667: stdout chunk (state=3): >>> # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat<<< 28285 1727204263.37693: stdout chunk (state=3): >>> # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases<<< 28285 1727204263.37712: stdout chunk (state=3): >>> # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external<<< 28285 1727204263.37747: stdout chunk (state=3): >>> # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys<<< 28285 1727204263.37768: stdout chunk (state=3): >>> # cleanup[3] wiping builtins # destroy systemd._daemon<<< 28285 1727204263.37779: stdout chunk (state=3): >>> # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib<<< 28285 1727204263.37998: stdout chunk (state=3): >>> # destroy _signal # destroy platform <<< 28285 1727204263.38023: stdout chunk (state=3): >>># destroy _uuid # destroy _sre # destroy sre_parse <<< 28285 1727204263.38089: stdout chunk (state=3): >>># destroy tokenize # destroy _heapq <<< 28285 1727204263.38154: stdout chunk (state=3): >>># destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno<<< 28285 1727204263.38224: stdout chunk (state=3): >>> # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors <<< 28285 1727204263.38312: stdout chunk (state=3): >>># destroy select <<< 28285 1727204263.38342: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy 
ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator<<< 28285 1727204263.38358: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external<<< 28285 1727204263.38420: stdout chunk (state=3): >>> # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks <<< 28285 1727204263.38855: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 28285 1727204263.38938: stderr chunk (state=3): >>><<< 28285 1727204263.38941: stdout chunk (state=3): >>><<< 28285 1727204263.39122: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560cd8dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560c7d3a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560cd8b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560cd8ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560c7d490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560c7d940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560c7d670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches 
/usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560c34190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560c34220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560c57850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560c34940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560c95880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560c2dd90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560c57d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560c7d970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75609adf10> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75609b40a0> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75609a75b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75609ae6a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75609ad3d0> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7560931eb0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75609319a0> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560931fa0> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560931df0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560941160> import '_collections' # import 
'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560989e20> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560981700> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560995760> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75609b5eb0> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7560941d60> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560989340> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7560995370> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75609bba60> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560941f40> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560941e80> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560941df0> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560915460> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches 
/usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560915550> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75608f30d0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560944b20> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75609444c0> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75608492b0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560900d60> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560944fa0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75609bb0d0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560859be0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7560859f10> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f756086c820> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f756086cd60> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f75607fa490> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560859f40> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # 
extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f756080a370> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f756086c6a0> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f756080a430> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560941ac0> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7560826790> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7560826a60> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560826850> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7560826940> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7560826d90> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f75608302e0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75608269d0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f756081ab20> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75609416a0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 
'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560826b80> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f7560741760> # zipimport: found 30 names in '/tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f756014d8b0> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f756014d160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f756014d280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f756014d5e0> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f756014d4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f756014de20> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f756014d580> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f756014d100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object 
from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75600a4fd0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f75600c2c40> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f75600c2f40> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75600c22e0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75601b5d90> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75601b53a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75601b5f40> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560741a90> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560120dc0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560120490> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560157580> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f75601205b0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75601205e0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7560095f70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75601962e0> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f75600927f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560196460> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75601adf40> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560092790> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f75600925e0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7560091550> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7560091490> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f756018e9a0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from 
'/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f75601166a0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7560115bb0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75601260d0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7560116100> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560159c40> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f756005d940> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560113d30> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f756010a7c0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/_text.py # zipimport: zlib 
available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75601154c0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f755fbea940> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f75601a0b50> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f755fbe9070> import ansible.module_utils.common.file # 
loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f755fc3a6d0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7560054c10> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f75600535b0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_0bu67tnd/ansible_stat_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # 
cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing 
systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy 
ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy 
grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
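The SSH stderr above shows the task reusing an already-established master connection ("auto-mux: Trying existing master", "mux_client_request_session"), and the interpreter-cleanup chatter arrives on the same streams as the module's JSON result, which is exactly what the "junk after the JSON data" warning in the next entry reports. As a hedged illustration only: connection reuse of this kind is normally driven by OpenSSH ControlMaster options passed through per-host connection variables. This run sets ansible_ssh_extra_args per host (its value is not shown in this log); the snippet below is a generic sketch using ansible_ssh_common_args, and the ControlPath is a hypothetical example, not a value read from this run.

# host_vars/managed-node1.yml -- illustrative sketch only; the ControlPath and
# the use of ansible_ssh_common_args are assumptions, not taken from this run
ansible_host: 10.31.9.148
ansible_ssh_common_args: >-
  -o ControlMaster=auto
  -o ControlPersist=60s
  -o ControlPath=/root/.ansible/cp/%h-%p-%r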
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # 
cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] 
removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # 
cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks 28285 1727204263.39717: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204262.855118-28541-67791356262793/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28285 1727204263.39721: _low_level_execute_command(): starting 28285 1727204263.39723: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204262.855118-28541-67791356262793/ > /dev/null 2>&1 && sleep 0' 28285 1727204263.40619: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 28285 1727204263.40623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28285 1727204263.40887: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 28285 1727204263.40890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28285 1727204263.40893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28285 1727204263.41049: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28285 1727204263.41059: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28285 1727204263.41093: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28285 1727204263.41174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 28285 1727204263.43796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28285 1727204263.43800: stdout chunk (state=3): >>><<< 28285 1727204263.43803: stderr chunk (state=3): >>><<< 28285 1727204263.43871: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 28285 1727204263.43874: handler run complete 28285 1727204263.43877: attempt loop complete, returning result 28285 1727204263.43880: _execute() done 28285 1727204263.43882: dumping result to json 28285 1727204263.43885: done dumping result, returning 28285 1727204263.43888: done running TaskExecutor() for managed-node1/TASK: Check if system is ostree [0affcd87-79f5-57a1-d976-0000000001bd] 28285 1727204263.43890: sending task result for task 0affcd87-79f5-57a1-d976-0000000001bd 28285 1727204263.44135: done sending task result for task 0affcd87-79f5-57a1-d976-0000000001bd 28285 1727204263.44139: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "exists": false } } 28285 1727204263.44206: no more pending results, returning what we have 28285 1727204263.44209: results queue empty 28285 1727204263.44210: checking for any_errors_fatal 28285 1727204263.44217: done checking for any_errors_fatal 28285 1727204263.44218: checking for max_fail_percentage 28285 1727204263.44220: done checking for max_fail_percentage 28285 1727204263.44220: checking to see if all hosts have failed and the running result is not ok 28285 1727204263.44221: done checking to see if all hosts have failed 28285 1727204263.44222: getting the remaining hosts for this loop 28285 
1727204263.44223: done getting the remaining hosts for this loop 28285 1727204263.44226: getting the next task for host managed-node1 28285 1727204263.44231: done getting next task for host managed-node1 28285 1727204263.44235: ^ task is: TASK: Set flag to indicate system is ostree 28285 1727204263.44238: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204263.44241: getting variables 28285 1727204263.44242: in VariableManager get_vars() 28285 1727204263.44272: Calling all_inventory to load vars for managed-node1 28285 1727204263.44275: Calling groups_inventory to load vars for managed-node1 28285 1727204263.44278: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204263.44287: Calling all_plugins_play to load vars for managed-node1 28285 1727204263.44290: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204263.44292: Calling groups_plugins_play to load vars for managed-node1 28285 1727204263.44444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204263.44661: done with get_vars() 28285 1727204263.44674: done getting variables 28285 1727204263.44772: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Tuesday 24 September 2024 14:57:43 -0400 (0:00:00.647) 0:00:04.258 ***** 28285 1727204263.45027: entering _queue_task() for managed-node1/set_fact 28285 1727204263.45029: Creating lock for set_fact 28285 1727204263.45754: worker is 1 (out of 1 available) 28285 1727204263.45768: exiting _queue_task() for managed-node1/set_fact 28285 1727204263.45779: done queuing things up, now waiting for results queue to drain 28285 1727204263.45781: waiting for pending results... 
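For orientation, the two tasks traced above and below can be reconstructed from values that appear in the log itself: a stat call against /run/ostree-booted whose result is registered as __ostree_booted_stat, and a set_fact that derives __network_is_ostree from it, gated on "not __network_is_ostree is defined". The YAML below is a sketch assembled from those logged values, not a verbatim copy of el_repo_setup.yml.

# Sketch reconstructed from the logged module args, registered variable,
# conditional, and result; the authoritative YAML lives in
# tests/network/tasks/el_repo_setup.yml and may differ in detail
- name: Check if system is ostree
  stat:
    path: /run/ostree-booted
  register: __ostree_booted_stat
  # a guard like the one on the set_fact task is likely present here as well,
  # but its evaluation is not part of this excerpt

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"   # expression assumed
  when: not __network_is_ostree is defined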
28285 1727204263.46918: running TaskExecutor() for managed-node1/TASK: Set flag to indicate system is ostree 28285 1727204263.47382: in run() - task 0affcd87-79f5-57a1-d976-0000000001be 28285 1727204263.47403: variable 'ansible_search_path' from source: unknown 28285 1727204263.47411: variable 'ansible_search_path' from source: unknown 28285 1727204263.47456: calling self._execute() 28285 1727204263.47538: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204263.47915: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204263.47930: variable 'omit' from source: magic vars 28285 1727204263.48405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28285 1727204263.48729: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28285 1727204263.48784: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28285 1727204263.48823: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28285 1727204263.48866: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28285 1727204263.48954: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28285 1727204263.48986: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28285 1727204263.49015: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204263.49044: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28285 1727204263.49177: Evaluated conditional (not __network_is_ostree is defined): True 28285 1727204263.49188: variable 'omit' from source: magic vars 28285 1727204263.49231: variable 'omit' from source: magic vars 28285 1727204263.49360: variable '__ostree_booted_stat' from source: set_fact 28285 1727204263.49414: variable 'omit' from source: magic vars 28285 1727204263.49444: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28285 1727204263.49480: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28285 1727204263.49503: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28285 1727204263.49525: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28285 1727204263.49540: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28285 1727204263.49577: variable 'inventory_hostname' from source: host vars for 'managed-node1' 28285 1727204263.49585: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204263.49592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204263.49693: Set connection var ansible_shell_executable to /bin/sh 28285 
1727204263.49708: Set connection var ansible_pipelining to False 28285 1727204263.49723: Set connection var ansible_timeout to 10 28285 1727204263.49730: Set connection var ansible_shell_type to sh 28285 1727204263.49738: Set connection var ansible_connection to ssh 28285 1727204263.49747: Set connection var ansible_module_compression to ZIP_DEFLATED 28285 1727204263.49781: variable 'ansible_shell_executable' from source: unknown 28285 1727204263.49789: variable 'ansible_connection' from source: unknown 28285 1727204263.49795: variable 'ansible_module_compression' from source: unknown 28285 1727204263.49801: variable 'ansible_shell_type' from source: unknown 28285 1727204263.49807: variable 'ansible_shell_executable' from source: unknown 28285 1727204263.49812: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204263.49819: variable 'ansible_pipelining' from source: unknown 28285 1727204263.49824: variable 'ansible_timeout' from source: unknown 28285 1727204263.49830: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204263.49932: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28285 1727204263.49945: variable 'omit' from source: magic vars 28285 1727204263.49957: starting attempt loop 28285 1727204263.49965: running the handler 28285 1727204263.49981: handler run complete 28285 1727204263.49993: attempt loop complete, returning result 28285 1727204263.49999: _execute() done 28285 1727204263.50004: dumping result to json 28285 1727204263.50010: done dumping result, returning 28285 1727204263.50019: done running TaskExecutor() for managed-node1/TASK: Set flag to indicate system is ostree [0affcd87-79f5-57a1-d976-0000000001be] 28285 1727204263.50028: sending task result for task 0affcd87-79f5-57a1-d976-0000000001be ok: [managed-node1] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 28285 1727204263.50169: no more pending results, returning what we have 28285 1727204263.50172: results queue empty 28285 1727204263.50172: checking for any_errors_fatal 28285 1727204263.50179: done checking for any_errors_fatal 28285 1727204263.50180: checking for max_fail_percentage 28285 1727204263.50181: done checking for max_fail_percentage 28285 1727204263.50182: checking to see if all hosts have failed and the running result is not ok 28285 1727204263.50183: done checking to see if all hosts have failed 28285 1727204263.50183: getting the remaining hosts for this loop 28285 1727204263.50185: done getting the remaining hosts for this loop 28285 1727204263.50189: getting the next task for host managed-node1 28285 1727204263.50196: done getting next task for host managed-node1 28285 1727204263.50199: ^ task is: TASK: Fix CentOS6 Base repo 28285 1727204263.50201: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 28285 1727204263.50204: getting variables 28285 1727204263.50205: in VariableManager get_vars() 28285 1727204263.50234: Calling all_inventory to load vars for managed-node1 28285 1727204263.50236: Calling groups_inventory to load vars for managed-node1 28285 1727204263.50239: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204263.50253: Calling all_plugins_play to load vars for managed-node1 28285 1727204263.50256: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204263.50259: Calling groups_plugins_play to load vars for managed-node1 28285 1727204263.50432: done sending task result for task 0affcd87-79f5-57a1-d976-0000000001be 28285 1727204263.50453: WORKER PROCESS EXITING 28285 1727204263.50459: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204263.50718: done with get_vars() 28285 1727204263.50727: done getting variables 28285 1727204263.50904: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Tuesday 24 September 2024 14:57:43 -0400 (0:00:00.062) 0:00:04.320 ***** 28285 1727204263.51049: entering _queue_task() for managed-node1/copy 28285 1727204263.51346: worker is 1 (out of 1 available) 28285 1727204263.51368: exiting _queue_task() for managed-node1/copy 28285 1727204263.51380: done queuing things up, now waiting for results queue to drain 28285 1727204263.51382: waiting for pending results... 
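The "Fix CentOS6 Base repo" task queued above resolves to the copy action (the log loads ActionModule 'copy') and, as the skip result that follows shows, only applies when ansible_distribution is CentOS and the major version is 6. The sketch below keeps only those logged pieces; the destination path and file content are hypothetical placeholders, not read from the playbook.

# Shape inferred from the log; dest and content are hypothetical placeholders
- name: Fix CentOS6 Base repo
  copy:
    dest: /etc/yum.repos.d/CentOS-Base.repo   # hypothetical path
    content: |
      # repo definition omitted; not visible in this log
  when:
    - ansible_distribution == 'CentOS'
    - ansible_distribution_major_version == '6'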
28285 1727204263.52222: running TaskExecutor() for managed-node1/TASK: Fix CentOS6 Base repo 28285 1727204263.52336: in run() - task 0affcd87-79f5-57a1-d976-0000000001c0 28285 1727204263.52985: variable 'ansible_search_path' from source: unknown 28285 1727204263.52994: variable 'ansible_search_path' from source: unknown 28285 1727204263.53037: calling self._execute() 28285 1727204263.53125: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204263.53137: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204263.53154: variable 'omit' from source: magic vars 28285 1727204263.53638: variable 'ansible_distribution' from source: facts 28285 1727204263.54280: Evaluated conditional (ansible_distribution == 'CentOS'): True 28285 1727204263.54413: variable 'ansible_distribution_major_version' from source: facts 28285 1727204263.54429: Evaluated conditional (ansible_distribution_major_version == '6'): False 28285 1727204263.54437: when evaluation is False, skipping this task 28285 1727204263.54443: _execute() done 28285 1727204263.54451: dumping result to json 28285 1727204263.54458: done dumping result, returning 28285 1727204263.54471: done running TaskExecutor() for managed-node1/TASK: Fix CentOS6 Base repo [0affcd87-79f5-57a1-d976-0000000001c0] 28285 1727204263.54481: sending task result for task 0affcd87-79f5-57a1-d976-0000000001c0 skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 28285 1727204263.54651: no more pending results, returning what we have 28285 1727204263.54655: results queue empty 28285 1727204263.54656: checking for any_errors_fatal 28285 1727204263.54661: done checking for any_errors_fatal 28285 1727204263.54662: checking for max_fail_percentage 28285 1727204263.54665: done checking for max_fail_percentage 28285 1727204263.54665: checking to see if all hosts have failed and the running result is not ok 28285 1727204263.54666: done checking to see if all hosts have failed 28285 1727204263.54667: getting the remaining hosts for this loop 28285 1727204263.54668: done getting the remaining hosts for this loop 28285 1727204263.54672: getting the next task for host managed-node1 28285 1727204263.54678: done getting next task for host managed-node1 28285 1727204263.54681: ^ task is: TASK: Include the task 'enable_epel.yml' 28285 1727204263.54684: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28285 1727204263.54687: getting variables 28285 1727204263.54689: in VariableManager get_vars() 28285 1727204263.54717: Calling all_inventory to load vars for managed-node1 28285 1727204263.54720: Calling groups_inventory to load vars for managed-node1 28285 1727204263.54723: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204263.54737: Calling all_plugins_play to load vars for managed-node1 28285 1727204263.54739: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204263.54743: Calling groups_plugins_play to load vars for managed-node1 28285 1727204263.54911: done sending task result for task 0affcd87-79f5-57a1-d976-0000000001c0 28285 1727204263.54915: WORKER PROCESS EXITING 28285 1727204263.54930: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204263.55133: done with get_vars() 28285 1727204263.55145: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Tuesday 24 September 2024 14:57:43 -0400 (0:00:00.041) 0:00:04.362 ***** 28285 1727204263.55250: entering _queue_task() for managed-node1/include_tasks 28285 1727204263.55998: worker is 1 (out of 1 available) 28285 1727204263.56010: exiting _queue_task() for managed-node1/include_tasks 28285 1727204263.56024: done queuing things up, now waiting for results queue to drain 28285 1727204263.56025: waiting for pending results... 28285 1727204263.56853: running TaskExecutor() for managed-node1/TASK: Include the task 'enable_epel.yml' 28285 1727204263.57136: in run() - task 0affcd87-79f5-57a1-d976-0000000001c1 28285 1727204263.57179: variable 'ansible_search_path' from source: unknown 28285 1727204263.57202: variable 'ansible_search_path' from source: unknown 28285 1727204263.57245: calling self._execute() 28285 1727204263.57392: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204263.58279: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204263.58303: variable 'omit' from source: magic vars 28285 1727204263.59355: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204263.64530: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204263.65136: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204263.65188: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204263.65229: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204263.65279: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204263.65361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204263.65399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204263.65428: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204263.65480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204263.65500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204263.65620: variable '__network_is_ostree' from source: set_fact 28285 1727204263.65984: Evaluated conditional (not __network_is_ostree | d(false)): True 28285 1727204263.65995: _execute() done 28285 1727204263.66002: dumping result to json 28285 1727204263.66008: done dumping result, returning 28285 1727204263.66017: done running TaskExecutor() for managed-node1/TASK: Include the task 'enable_epel.yml' [0affcd87-79f5-57a1-d976-0000000001c1] 28285 1727204263.66028: sending task result for task 0affcd87-79f5-57a1-d976-0000000001c1 28285 1727204263.66158: no more pending results, returning what we have 28285 1727204263.66163: in VariableManager get_vars() 28285 1727204263.66198: Calling all_inventory to load vars for managed-node1 28285 1727204263.66201: Calling groups_inventory to load vars for managed-node1 28285 1727204263.66204: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204263.66216: Calling all_plugins_play to load vars for managed-node1 28285 1727204263.66219: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204263.66222: Calling groups_plugins_play to load vars for managed-node1 28285 1727204263.66425: done sending task result for task 0affcd87-79f5-57a1-d976-0000000001c1 28285 1727204263.66431: WORKER PROCESS EXITING 28285 1727204263.66445: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204263.66633: done with get_vars() 28285 1727204263.66645: variable 'ansible_search_path' from source: unknown 28285 1727204263.66646: variable 'ansible_search_path' from source: unknown 28285 1727204263.66689: we have included files to process 28285 1727204263.66691: generating all_blocks data 28285 1727204263.66692: done generating all_blocks data 28285 1727204263.66698: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 28285 1727204263.66699: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 28285 1727204263.66702: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 28285 1727204263.68048: done processing included file 28285 1727204263.68165: iterating over new_blocks loaded from include file 28285 1727204263.68167: in VariableManager get_vars() 28285 1727204263.68182: done with get_vars() 28285 1727204263.68183: filtering new block on tags 28285 1727204263.68204: done filtering new block on tags 28285 1727204263.68207: in VariableManager get_vars() 28285 1727204263.68216: done with get_vars() 28285 1727204263.68218: filtering new block on tags 28285 1727204263.68227: done filtering new block on tags 28285 1727204263.68228: done iterating over new_blocks loaded from include file included: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed-node1 28285 1727204263.68233: extending task lists for all hosts with included blocks 28285 1727204263.68437: done extending task lists 28285 1727204263.68439: done processing included files 28285 1727204263.68440: results queue empty 28285 1727204263.68440: checking for any_errors_fatal 28285 1727204263.68444: done checking for any_errors_fatal 28285 1727204263.68445: checking for max_fail_percentage 28285 1727204263.68446: done checking for max_fail_percentage 28285 1727204263.68447: checking to see if all hosts have failed and the running result is not ok 28285 1727204263.68448: done checking to see if all hosts have failed 28285 1727204263.68448: getting the remaining hosts for this loop 28285 1727204263.68450: done getting the remaining hosts for this loop 28285 1727204263.68452: getting the next task for host managed-node1 28285 1727204263.68456: done getting next task for host managed-node1 28285 1727204263.68458: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 28285 1727204263.68461: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28285 1727204263.68594: getting variables 28285 1727204263.68596: in VariableManager get_vars() 28285 1727204263.68605: Calling all_inventory to load vars for managed-node1 28285 1727204263.68608: Calling groups_inventory to load vars for managed-node1 28285 1727204263.68610: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204263.68615: Calling all_plugins_play to load vars for managed-node1 28285 1727204263.68623: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204263.68626: Calling groups_plugins_play to load vars for managed-node1 28285 1727204263.68972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204263.69396: done with get_vars() 28285 1727204263.69405: done getting variables 28285 1727204263.69582: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 28285 1727204263.69882: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 9] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Tuesday 24 September 2024 14:57:43 -0400 (0:00:00.147) 0:00:04.510 ***** 28285 1727204263.70043: entering _queue_task() for managed-node1/command 28285 1727204263.70045: Creating lock for command 28285 1727204263.70627: worker is 1 (out of 1 available) 28285 1727204263.70639: exiting _queue_task() for managed-node1/command 28285 1727204263.70765: done queuing things up, now waiting for results queue to drain 28285 1727204263.70768: waiting for pending results... 
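Two things are visible in the trace above: the include of enable_epel.yml is gated on "not __network_is_ostree | d(false)" (evaluated True on this node), and the first included task, "Create EPEL {{ ansible_distribution_major_version }}", is a command task that only runs when the distribution is RedHat/CentOS and the major version is 7 or 8, so it is skipped on this EL9 host. The sketch below uses only the logged conditionals; the command body is not shown in the log and the one given here is a hypothetical stand-in.

# el_repo_setup.yml side -- conditional taken verbatim from the log
- name: Include the task 'enable_epel.yml'
  include_tasks: enable_epel.yml   # both files sit under tests/network/tasks/
  when: not __network_is_ostree | d(false)

# enable_epel.yml side -- when-clauses from the log; the command itself is a
# hypothetical placeholder, not the real task body
- name: Create EPEL {{ ansible_distribution_major_version }}
  command: echo "install epel-release here"   # placeholder
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']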
28285 1727204263.71308: running TaskExecutor() for managed-node1/TASK: Create EPEL 9 28285 1727204263.71959: in run() - task 0affcd87-79f5-57a1-d976-0000000001db 28285 1727204263.71980: variable 'ansible_search_path' from source: unknown 28285 1727204263.71988: variable 'ansible_search_path' from source: unknown 28285 1727204263.72025: calling self._execute() 28285 1727204263.72103: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204263.72115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204263.72128: variable 'omit' from source: magic vars 28285 1727204263.72503: variable 'ansible_distribution' from source: facts 28285 1727204263.72524: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 28285 1727204263.72671: variable 'ansible_distribution_major_version' from source: facts 28285 1727204263.72682: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 28285 1727204263.72689: when evaluation is False, skipping this task 28285 1727204263.72695: _execute() done 28285 1727204263.72701: dumping result to json 28285 1727204263.72708: done dumping result, returning 28285 1727204263.72718: done running TaskExecutor() for managed-node1/TASK: Create EPEL 9 [0affcd87-79f5-57a1-d976-0000000001db] 28285 1727204263.72734: sending task result for task 0affcd87-79f5-57a1-d976-0000000001db skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 28285 1727204263.72915: no more pending results, returning what we have 28285 1727204263.72918: results queue empty 28285 1727204263.72919: checking for any_errors_fatal 28285 1727204263.72920: done checking for any_errors_fatal 28285 1727204263.72921: checking for max_fail_percentage 28285 1727204263.72923: done checking for max_fail_percentage 28285 1727204263.72923: checking to see if all hosts have failed and the running result is not ok 28285 1727204263.72924: done checking to see if all hosts have failed 28285 1727204263.72925: getting the remaining hosts for this loop 28285 1727204263.72927: done getting the remaining hosts for this loop 28285 1727204263.72930: getting the next task for host managed-node1 28285 1727204263.72938: done getting next task for host managed-node1 28285 1727204263.72940: ^ task is: TASK: Install yum-utils package 28285 1727204263.72944: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28285 1727204263.72948: getting variables 28285 1727204263.72949: in VariableManager get_vars() 28285 1727204263.72979: Calling all_inventory to load vars for managed-node1 28285 1727204263.72982: Calling groups_inventory to load vars for managed-node1 28285 1727204263.72985: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204263.72999: Calling all_plugins_play to load vars for managed-node1 28285 1727204263.73001: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204263.73004: Calling groups_plugins_play to load vars for managed-node1 28285 1727204263.73159: done sending task result for task 0affcd87-79f5-57a1-d976-0000000001db 28285 1727204263.73165: WORKER PROCESS EXITING 28285 1727204263.73179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204263.73553: done with get_vars() 28285 1727204263.73562: done getting variables 28285 1727204263.73869: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Tuesday 24 September 2024 14:57:43 -0400 (0:00:00.038) 0:00:04.548 ***** 28285 1727204263.73897: entering _queue_task() for managed-node1/package 28285 1727204263.73899: Creating lock for package 28285 1727204263.74568: worker is 1 (out of 1 available) 28285 1727204263.74588: exiting _queue_task() for managed-node1/package 28285 1727204263.74601: done queuing things up, now waiting for results queue to drain 28285 1727204263.74602: waiting for pending results... 
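The "Install yum-utils package" task uses the generic package action (the log creates a lock for 'package') and, as the skip result that follows shows, carries the same distribution/version guard; the Enable EPEL 7 and Enable EPEL 8 command tasks traced after it are gated identically and skip for the same reason. A sketch built from the logged pieces; the package name comes from the task title and the state is assumed.

# Sketch; module and when-clauses from the log, package name from the task
# title, state assumed
- name: Install yum-utils package
  package:
    name: yum-utils
    state: present   # assumed
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']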
28285 1727204263.75413: running TaskExecutor() for managed-node1/TASK: Install yum-utils package 28285 1727204263.75541: in run() - task 0affcd87-79f5-57a1-d976-0000000001dc 28285 1727204263.75706: variable 'ansible_search_path' from source: unknown 28285 1727204263.75714: variable 'ansible_search_path' from source: unknown 28285 1727204263.75757: calling self._execute() 28285 1727204263.75956: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204263.75969: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204263.75983: variable 'omit' from source: magic vars 28285 1727204263.76815: variable 'ansible_distribution' from source: facts 28285 1727204263.76838: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 28285 1727204263.77102: variable 'ansible_distribution_major_version' from source: facts 28285 1727204263.77114: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 28285 1727204263.77213: when evaluation is False, skipping this task 28285 1727204263.77222: _execute() done 28285 1727204263.77230: dumping result to json 28285 1727204263.77236: done dumping result, returning 28285 1727204263.77246: done running TaskExecutor() for managed-node1/TASK: Install yum-utils package [0affcd87-79f5-57a1-d976-0000000001dc] 28285 1727204263.77260: sending task result for task 0affcd87-79f5-57a1-d976-0000000001dc skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 28285 1727204263.77417: no more pending results, returning what we have 28285 1727204263.77422: results queue empty 28285 1727204263.77423: checking for any_errors_fatal 28285 1727204263.77433: done checking for any_errors_fatal 28285 1727204263.77434: checking for max_fail_percentage 28285 1727204263.77435: done checking for max_fail_percentage 28285 1727204263.77436: checking to see if all hosts have failed and the running result is not ok 28285 1727204263.77437: done checking to see if all hosts have failed 28285 1727204263.77438: getting the remaining hosts for this loop 28285 1727204263.77439: done getting the remaining hosts for this loop 28285 1727204263.77443: getting the next task for host managed-node1 28285 1727204263.77450: done getting next task for host managed-node1 28285 1727204263.77453: ^ task is: TASK: Enable EPEL 7 28285 1727204263.77456: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28285 1727204263.77459: getting variables 28285 1727204263.77461: in VariableManager get_vars() 28285 1727204263.77496: Calling all_inventory to load vars for managed-node1 28285 1727204263.77499: Calling groups_inventory to load vars for managed-node1 28285 1727204263.77503: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204263.77519: Calling all_plugins_play to load vars for managed-node1 28285 1727204263.77523: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204263.77527: Calling groups_plugins_play to load vars for managed-node1 28285 1727204263.77694: done sending task result for task 0affcd87-79f5-57a1-d976-0000000001dc 28285 1727204263.77698: WORKER PROCESS EXITING 28285 1727204263.77719: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204263.77942: done with get_vars() 28285 1727204263.77959: done getting variables 28285 1727204263.78042: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Tuesday 24 September 2024 14:57:43 -0400 (0:00:00.041) 0:00:04.590 ***** 28285 1727204263.78074: entering _queue_task() for managed-node1/command 28285 1727204263.78760: worker is 1 (out of 1 available) 28285 1727204263.78804: exiting _queue_task() for managed-node1/command 28285 1727204263.78817: done queuing things up, now waiting for results queue to drain 28285 1727204263.78818: waiting for pending results... 
28285 1727204263.79783: running TaskExecutor() for managed-node1/TASK: Enable EPEL 7 28285 1727204263.79903: in run() - task 0affcd87-79f5-57a1-d976-0000000001dd 28285 1727204263.79956: variable 'ansible_search_path' from source: unknown 28285 1727204263.79972: variable 'ansible_search_path' from source: unknown 28285 1727204263.80018: calling self._execute() 28285 1727204263.80119: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204263.80135: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204263.80151: variable 'omit' from source: magic vars 28285 1727204263.80547: variable 'ansible_distribution' from source: facts 28285 1727204263.80577: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 28285 1727204263.80713: variable 'ansible_distribution_major_version' from source: facts 28285 1727204263.80724: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 28285 1727204263.80732: when evaluation is False, skipping this task 28285 1727204263.80739: _execute() done 28285 1727204263.80745: dumping result to json 28285 1727204263.80756: done dumping result, returning 28285 1727204263.80770: done running TaskExecutor() for managed-node1/TASK: Enable EPEL 7 [0affcd87-79f5-57a1-d976-0000000001dd] 28285 1727204263.80787: sending task result for task 0affcd87-79f5-57a1-d976-0000000001dd 28285 1727204263.80895: done sending task result for task 0affcd87-79f5-57a1-d976-0000000001dd skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 28285 1727204263.80944: no more pending results, returning what we have 28285 1727204263.80948: results queue empty 28285 1727204263.80948: checking for any_errors_fatal 28285 1727204263.80955: done checking for any_errors_fatal 28285 1727204263.80955: checking for max_fail_percentage 28285 1727204263.80957: done checking for max_fail_percentage 28285 1727204263.80959: checking to see if all hosts have failed and the running result is not ok 28285 1727204263.80959: done checking to see if all hosts have failed 28285 1727204263.80960: getting the remaining hosts for this loop 28285 1727204263.80962: done getting the remaining hosts for this loop 28285 1727204263.80968: getting the next task for host managed-node1 28285 1727204263.80975: done getting next task for host managed-node1 28285 1727204263.80978: ^ task is: TASK: Enable EPEL 8 28285 1727204263.80982: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28285 1727204263.80985: getting variables 28285 1727204263.80987: in VariableManager get_vars() 28285 1727204263.81015: Calling all_inventory to load vars for managed-node1 28285 1727204263.81018: Calling groups_inventory to load vars for managed-node1 28285 1727204263.81021: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204263.81035: Calling all_plugins_play to load vars for managed-node1 28285 1727204263.81038: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204263.81041: Calling groups_plugins_play to load vars for managed-node1 28285 1727204263.81246: WORKER PROCESS EXITING 28285 1727204263.81262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204263.81460: done with get_vars() 28285 1727204263.81474: done getting variables 28285 1727204263.81533: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Tuesday 24 September 2024 14:57:43 -0400 (0:00:00.034) 0:00:04.625 ***** 28285 1727204263.81575: entering _queue_task() for managed-node1/command 28285 1727204263.82133: worker is 1 (out of 1 available) 28285 1727204263.82145: exiting _queue_task() for managed-node1/command 28285 1727204263.82158: done queuing things up, now waiting for results queue to drain 28285 1727204263.82160: waiting for pending results... 
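The skip results above each carry a 'false_condition' and a 'skip_reason' field. Purely as a general illustration (not something this test run does), a skipped task can be registered and those fields inspected afterwards; 'epel_result' is an invented variable name and the command is a placeholder.

    # Hypothetical illustration of inspecting a skip result; not part of the
    # logged test run.
    - name: Enable EPEL 8
      ansible.builtin.command: dnf config-manager --set-enabled epel   # placeholder
      when: ansible_distribution_major_version in ['7', '8']
      register: epel_result

    - name: Report why the previous task was skipped
      ansible.builtin.debug:
        msg: "Skipped because: {{ epel_result.false_condition | default('not skipped') }}"
      when: epel_result is skipped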
28285 1727204263.82460: running TaskExecutor() for managed-node1/TASK: Enable EPEL 8 28285 1727204263.82588: in run() - task 0affcd87-79f5-57a1-d976-0000000001de 28285 1727204263.82612: variable 'ansible_search_path' from source: unknown 28285 1727204263.82620: variable 'ansible_search_path' from source: unknown 28285 1727204263.82667: calling self._execute() 28285 1727204263.82761: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204263.82774: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204263.82787: variable 'omit' from source: magic vars 28285 1727204263.83202: variable 'ansible_distribution' from source: facts 28285 1727204263.83220: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 28285 1727204263.83375: variable 'ansible_distribution_major_version' from source: facts 28285 1727204263.83387: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 28285 1727204263.83395: when evaluation is False, skipping this task 28285 1727204263.83402: _execute() done 28285 1727204263.83409: dumping result to json 28285 1727204263.83416: done dumping result, returning 28285 1727204263.83425: done running TaskExecutor() for managed-node1/TASK: Enable EPEL 8 [0affcd87-79f5-57a1-d976-0000000001de] 28285 1727204263.83436: sending task result for task 0affcd87-79f5-57a1-d976-0000000001de 28285 1727204263.83559: done sending task result for task 0affcd87-79f5-57a1-d976-0000000001de 28285 1727204263.83572: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 28285 1727204263.83626: no more pending results, returning what we have 28285 1727204263.83631: results queue empty 28285 1727204263.83632: checking for any_errors_fatal 28285 1727204263.83636: done checking for any_errors_fatal 28285 1727204263.83636: checking for max_fail_percentage 28285 1727204263.83639: done checking for max_fail_percentage 28285 1727204263.83640: checking to see if all hosts have failed and the running result is not ok 28285 1727204263.83641: done checking to see if all hosts have failed 28285 1727204263.83641: getting the remaining hosts for this loop 28285 1727204263.83643: done getting the remaining hosts for this loop 28285 1727204263.83651: getting the next task for host managed-node1 28285 1727204263.83660: done getting next task for host managed-node1 28285 1727204263.83662: ^ task is: TASK: Enable EPEL 6 28285 1727204263.83669: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28285 1727204263.83673: getting variables 28285 1727204263.83675: in VariableManager get_vars() 28285 1727204263.83705: Calling all_inventory to load vars for managed-node1 28285 1727204263.83708: Calling groups_inventory to load vars for managed-node1 28285 1727204263.83712: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204263.83726: Calling all_plugins_play to load vars for managed-node1 28285 1727204263.83729: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204263.83733: Calling groups_plugins_play to load vars for managed-node1 28285 1727204263.83923: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204263.84171: done with get_vars() 28285 1727204263.84182: done getting variables 28285 1727204263.84269: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Tuesday 24 September 2024 14:57:43 -0400 (0:00:00.027) 0:00:04.653 ***** 28285 1727204263.84308: entering _queue_task() for managed-node1/copy 28285 1727204263.84838: worker is 1 (out of 1 available) 28285 1727204263.84852: exiting _queue_task() for managed-node1/copy 28285 1727204263.84868: done queuing things up, now waiting for results queue to drain 28285 1727204263.84872: waiting for pending results... 28285 1727204263.85120: running TaskExecutor() for managed-node1/TASK: Enable EPEL 6 28285 1727204263.85233: in run() - task 0affcd87-79f5-57a1-d976-0000000001e0 28285 1727204263.85251: variable 'ansible_search_path' from source: unknown 28285 1727204263.85259: variable 'ansible_search_path' from source: unknown 28285 1727204263.85307: calling self._execute() 28285 1727204263.85388: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204263.85401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204263.85417: variable 'omit' from source: magic vars 28285 1727204263.85805: variable 'ansible_distribution' from source: facts 28285 1727204263.85824: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 28285 1727204263.85991: variable 'ansible_distribution_major_version' from source: facts 28285 1727204263.86006: Evaluated conditional (ansible_distribution_major_version == '6'): False 28285 1727204263.86029: when evaluation is False, skipping this task 28285 1727204263.86035: _execute() done 28285 1727204263.86042: dumping result to json 28285 1727204263.86051: done dumping result, returning 28285 1727204263.86065: done running TaskExecutor() for managed-node1/TASK: Enable EPEL 6 [0affcd87-79f5-57a1-d976-0000000001e0] 28285 1727204263.86079: sending task result for task 0affcd87-79f5-57a1-d976-0000000001e0 28285 1727204263.86578: done sending task result for task 0affcd87-79f5-57a1-d976-0000000001e0 skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 28285 1727204263.86634: no more pending results, returning what we have 28285 1727204263.86638: results queue empty 28285 1727204263.86639: 
checking for any_errors_fatal 28285 1727204263.86646: done checking for any_errors_fatal 28285 1727204263.86646: checking for max_fail_percentage 28285 1727204263.86651: done checking for max_fail_percentage 28285 1727204263.86652: checking to see if all hosts have failed and the running result is not ok 28285 1727204263.86653: done checking to see if all hosts have failed 28285 1727204263.86654: getting the remaining hosts for this loop 28285 1727204263.86655: done getting the remaining hosts for this loop 28285 1727204263.86660: getting the next task for host managed-node1 28285 1727204263.86670: done getting next task for host managed-node1 28285 1727204263.86673: ^ task is: TASK: Set network provider to 'initscripts' 28285 1727204263.86676: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204263.86680: getting variables 28285 1727204263.86682: in VariableManager get_vars() 28285 1727204263.86712: Calling all_inventory to load vars for managed-node1 28285 1727204263.86716: Calling groups_inventory to load vars for managed-node1 28285 1727204263.86720: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204263.86733: Calling all_plugins_play to load vars for managed-node1 28285 1727204263.86737: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204263.86740: Calling groups_plugins_play to load vars for managed-node1 28285 1727204263.87004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204263.87231: done with get_vars() 28285 1727204263.87241: done getting variables 28285 1727204263.87427: WORKER PROCESS EXITING 28285 1727204263.87475: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set network provider to 'initscripts'] *********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethtool_features_initscripts.yml:12 Tuesday 24 September 2024 14:57:43 -0400 (0:00:00.031) 0:00:04.685 ***** 28285 1727204263.87508: entering _queue_task() for managed-node1/set_fact 28285 1727204263.88014: worker is 1 (out of 1 available) 28285 1727204263.88027: exiting _queue_task() for managed-node1/set_fact 28285 1727204263.88038: done queuing things up, now waiting for results queue to drain 28285 1727204263.88040: waiting for pending results... 
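The task queued above, Set network provider to 'initscripts' (tests_ethtool_features_initscripts.yml:12), is a plain set_fact; its 'ok' result further down shows ansible_facts.network_provider being set to "initscripts". A minimal sketch consistent with that result (the actual task body is not visible in this trace and may differ):

    # Minimal sketch consistent with the logged result; the real task body is
    # not visible in this trace.
    - name: Set network provider to 'initscripts'
      ansible.builtin.set_fact:
        network_provider: initscripts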
28285 1727204263.89283: running TaskExecutor() for managed-node1/TASK: Set network provider to 'initscripts' 28285 1727204263.89402: in run() - task 0affcd87-79f5-57a1-d976-000000000007 28285 1727204263.89422: variable 'ansible_search_path' from source: unknown 28285 1727204263.89483: calling self._execute() 28285 1727204263.89565: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204263.89577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204263.89590: variable 'omit' from source: magic vars 28285 1727204263.89697: variable 'omit' from source: magic vars 28285 1727204263.89732: variable 'omit' from source: magic vars 28285 1727204263.89779: variable 'omit' from source: magic vars 28285 1727204263.89824: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28285 1727204263.89871: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28285 1727204263.89897: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28285 1727204263.90185: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28285 1727204263.90201: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28285 1727204263.90232: variable 'inventory_hostname' from source: host vars for 'managed-node1' 28285 1727204263.90241: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204263.90248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204263.90354: Set connection var ansible_shell_executable to /bin/sh 28285 1727204263.90370: Set connection var ansible_pipelining to False 28285 1727204263.90385: Set connection var ansible_timeout to 10 28285 1727204263.90396: Set connection var ansible_shell_type to sh 28285 1727204263.90406: Set connection var ansible_connection to ssh 28285 1727204263.90414: Set connection var ansible_module_compression to ZIP_DEFLATED 28285 1727204263.90439: variable 'ansible_shell_executable' from source: unknown 28285 1727204263.90446: variable 'ansible_connection' from source: unknown 28285 1727204263.90452: variable 'ansible_module_compression' from source: unknown 28285 1727204263.90458: variable 'ansible_shell_type' from source: unknown 28285 1727204263.90466: variable 'ansible_shell_executable' from source: unknown 28285 1727204263.90472: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204263.90479: variable 'ansible_pipelining' from source: unknown 28285 1727204263.90485: variable 'ansible_timeout' from source: unknown 28285 1727204263.90492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204263.90747: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 28285 1727204263.90763: variable 'omit' from source: magic vars 28285 1727204263.90774: starting attempt loop 28285 1727204263.90781: running the handler 28285 1727204263.90796: handler run complete 28285 1727204263.90839: attempt loop complete, returning result 28285 1727204263.90846: _execute() done 28285 
1727204263.90940: dumping result to json 28285 1727204263.90947: done dumping result, returning 28285 1727204263.90958: done running TaskExecutor() for managed-node1/TASK: Set network provider to 'initscripts' [0affcd87-79f5-57a1-d976-000000000007] 28285 1727204263.90971: sending task result for task 0affcd87-79f5-57a1-d976-000000000007 ok: [managed-node1] => { "ansible_facts": { "network_provider": "initscripts" }, "changed": false } 28285 1727204263.91193: no more pending results, returning what we have 28285 1727204263.91196: results queue empty 28285 1727204263.91197: checking for any_errors_fatal 28285 1727204263.91204: done checking for any_errors_fatal 28285 1727204263.91204: checking for max_fail_percentage 28285 1727204263.91206: done checking for max_fail_percentage 28285 1727204263.91207: checking to see if all hosts have failed and the running result is not ok 28285 1727204263.91207: done checking to see if all hosts have failed 28285 1727204263.91208: getting the remaining hosts for this loop 28285 1727204263.91210: done getting the remaining hosts for this loop 28285 1727204263.91213: getting the next task for host managed-node1 28285 1727204263.91220: done getting next task for host managed-node1 28285 1727204263.91222: ^ task is: TASK: meta (flush_handlers) 28285 1727204263.91224: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204263.91228: getting variables 28285 1727204263.91230: in VariableManager get_vars() 28285 1727204263.91259: Calling all_inventory to load vars for managed-node1 28285 1727204263.91262: Calling groups_inventory to load vars for managed-node1 28285 1727204263.91267: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204263.91279: Calling all_plugins_play to load vars for managed-node1 28285 1727204263.91282: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204263.91286: Calling groups_plugins_play to load vars for managed-node1 28285 1727204263.91457: done sending task result for task 0affcd87-79f5-57a1-d976-000000000007 28285 1727204263.91471: WORKER PROCESS EXITING 28285 1727204263.91491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204263.91724: done with get_vars() 28285 1727204263.91734: done getting variables 28285 1727204263.91809: in VariableManager get_vars() 28285 1727204263.91818: Calling all_inventory to load vars for managed-node1 28285 1727204263.91821: Calling groups_inventory to load vars for managed-node1 28285 1727204263.91823: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204263.91827: Calling all_plugins_play to load vars for managed-node1 28285 1727204263.91829: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204263.91832: Calling groups_plugins_play to load vars for managed-node1 28285 1727204263.92090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204263.92663: done with get_vars() 28285 1727204263.92679: done queuing things up, now waiting for results queue to drain 28285 1727204263.92681: results queue empty 28285 1727204263.92681: checking for any_errors_fatal 28285 1727204263.92683: done checking for any_errors_fatal 28285 
1727204263.92684: checking for max_fail_percentage 28285 1727204263.92685: done checking for max_fail_percentage 28285 1727204263.92686: checking to see if all hosts have failed and the running result is not ok 28285 1727204263.92687: done checking to see if all hosts have failed 28285 1727204263.92688: getting the remaining hosts for this loop 28285 1727204263.92688: done getting the remaining hosts for this loop 28285 1727204263.92691: getting the next task for host managed-node1 28285 1727204263.92984: done getting next task for host managed-node1 28285 1727204263.92986: ^ task is: TASK: meta (flush_handlers) 28285 1727204263.92988: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204263.93000: getting variables 28285 1727204263.93001: in VariableManager get_vars() 28285 1727204263.93039: Calling all_inventory to load vars for managed-node1 28285 1727204263.93042: Calling groups_inventory to load vars for managed-node1 28285 1727204263.93044: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204263.93049: Calling all_plugins_play to load vars for managed-node1 28285 1727204263.93051: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204263.93054: Calling groups_plugins_play to load vars for managed-node1 28285 1727204263.93200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204263.93358: done with get_vars() 28285 1727204263.93367: done getting variables 28285 1727204263.93408: in VariableManager get_vars() 28285 1727204263.93416: Calling all_inventory to load vars for managed-node1 28285 1727204263.93417: Calling groups_inventory to load vars for managed-node1 28285 1727204263.93419: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204263.93423: Calling all_plugins_play to load vars for managed-node1 28285 1727204263.93425: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204263.93427: Calling groups_plugins_play to load vars for managed-node1 28285 1727204263.93567: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204263.93750: done with get_vars() 28285 1727204263.93766: done queuing things up, now waiting for results queue to drain 28285 1727204263.93770: results queue empty 28285 1727204263.93770: checking for any_errors_fatal 28285 1727204263.93772: done checking for any_errors_fatal 28285 1727204263.93772: checking for max_fail_percentage 28285 1727204263.93774: done checking for max_fail_percentage 28285 1727204263.93775: checking to see if all hosts have failed and the running result is not ok 28285 1727204263.93776: done checking to see if all hosts have failed 28285 1727204263.93776: getting the remaining hosts for this loop 28285 1727204263.93777: done getting the remaining hosts for this loop 28285 1727204263.93780: getting the next task for host managed-node1 28285 1727204263.93783: done getting next task for host managed-node1 28285 1727204263.93784: ^ task is: None 28285 1727204263.93785: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204263.93786: done queuing things up, now waiting for results queue to drain 28285 1727204263.93787: results queue empty 28285 1727204263.93788: checking for any_errors_fatal 28285 1727204263.93789: done checking for any_errors_fatal 28285 1727204263.93789: checking for max_fail_percentage 28285 1727204263.93790: done checking for max_fail_percentage 28285 1727204263.93791: checking to see if all hosts have failed and the running result is not ok 28285 1727204263.93791: done checking to see if all hosts have failed 28285 1727204263.93793: getting the next task for host managed-node1 28285 1727204263.93795: done getting next task for host managed-node1 28285 1727204263.93796: ^ task is: None 28285 1727204263.93797: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204263.93847: in VariableManager get_vars() 28285 1727204263.93888: done with get_vars() 28285 1727204263.93895: in VariableManager get_vars() 28285 1727204263.93916: done with get_vars() 28285 1727204263.93921: variable 'omit' from source: magic vars 28285 1727204263.93952: in VariableManager get_vars() 28285 1727204263.93982: done with get_vars() 28285 1727204263.94006: variable 'omit' from source: magic vars PLAY [Play for testing ethtool features settings] ****************************** 28285 1727204263.95247: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 28285 1727204263.96258: getting the remaining hosts for this loop 28285 1727204263.96261: done getting the remaining hosts for this loop 28285 1727204263.96264: getting the next task for host managed-node1 28285 1727204263.96267: done getting next task for host managed-node1 28285 1727204263.96269: ^ task is: TASK: Gathering Facts 28285 1727204263.96272: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28285 1727204263.96274: getting variables 28285 1727204263.96275: in VariableManager get_vars() 28285 1727204263.96298: Calling all_inventory to load vars for managed-node1 28285 1727204263.96300: Calling groups_inventory to load vars for managed-node1 28285 1727204263.96302: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204263.96308: Calling all_plugins_play to load vars for managed-node1 28285 1727204263.96325: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204263.96330: Calling groups_plugins_play to load vars for managed-node1 28285 1727204263.96484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204263.96915: done with get_vars() 28285 1727204263.96924: done getting variables 28285 1727204263.96976: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethtool_features.yml:3 Tuesday 24 September 2024 14:57:43 -0400 (0:00:00.094) 0:00:04.780 ***** 28285 1727204263.97009: entering _queue_task() for managed-node1/gather_facts 28285 1727204263.97532: worker is 1 (out of 1 available) 28285 1727204263.97545: exiting _queue_task() for managed-node1/gather_facts 28285 1727204263.97675: done queuing things up, now waiting for results queue to drain 28285 1727204263.97677: waiting for pending results... 
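From 'Gathering Facts' onward, every task of this play is skipped for the same reason: the guard (ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9) evaluates to False on managed-node1. As an illustration only, a task guarded by that exact expression would look like this:

    # Illustration only: a task guarded by the same expression the trace shows
    # being evaluated to False for managed-node1.
    - name: Run only on CentOS/RHEL releases older than 9
      ansible.builtin.debug:
        msg: "EL {{ ansible_distribution_major_version }} host detected"
      when: >-
        ansible_distribution in ['CentOS', 'RedHat'] and
        ansible_distribution_major_version | int < 9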
28285 1727204263.98516: running TaskExecutor() for managed-node1/TASK: Gathering Facts 28285 1727204263.98745: in run() - task 0affcd87-79f5-57a1-d976-000000000206 28285 1727204263.98886: variable 'ansible_search_path' from source: unknown 28285 1727204263.98930: calling self._execute() 28285 1727204263.99035: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204263.99205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204263.99221: variable 'omit' from source: magic vars 28285 1727204263.99720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204264.03093: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204264.03181: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204264.03240: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204264.03286: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204264.03319: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204264.03423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204264.03473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204264.03503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204264.03561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204264.03586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204264.03736: variable 'ansible_distribution' from source: facts 28285 1727204264.03750: variable 'ansible_distribution_major_version' from source: facts 28285 1727204264.03785: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204264.03794: when evaluation is False, skipping this task 28285 1727204264.03801: _execute() done 28285 1727204264.03809: dumping result to json 28285 1727204264.03816: done dumping result, returning 28285 1727204264.03828: done running TaskExecutor() for managed-node1/TASK: Gathering Facts [0affcd87-79f5-57a1-d976-000000000206] 28285 1727204264.03839: sending task result for task 0affcd87-79f5-57a1-d976-000000000206 skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204264.03995: no more pending results, returning what we have 28285 1727204264.04000: results queue empty 28285 
1727204264.04001: checking for any_errors_fatal 28285 1727204264.04002: done checking for any_errors_fatal 28285 1727204264.04003: checking for max_fail_percentage 28285 1727204264.04005: done checking for max_fail_percentage 28285 1727204264.04006: checking to see if all hosts have failed and the running result is not ok 28285 1727204264.04007: done checking to see if all hosts have failed 28285 1727204264.04007: getting the remaining hosts for this loop 28285 1727204264.04010: done getting the remaining hosts for this loop 28285 1727204264.04014: getting the next task for host managed-node1 28285 1727204264.04020: done getting next task for host managed-node1 28285 1727204264.04023: ^ task is: TASK: meta (flush_handlers) 28285 1727204264.04025: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204264.04029: getting variables 28285 1727204264.04031: in VariableManager get_vars() 28285 1727204264.04095: Calling all_inventory to load vars for managed-node1 28285 1727204264.04098: Calling groups_inventory to load vars for managed-node1 28285 1727204264.04100: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204264.04111: Calling all_plugins_play to load vars for managed-node1 28285 1727204264.04113: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204264.04115: Calling groups_plugins_play to load vars for managed-node1 28285 1727204264.04705: done sending task result for task 0affcd87-79f5-57a1-d976-000000000206 28285 1727204264.04708: WORKER PROCESS EXITING 28285 1727204264.04729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204264.04952: done with get_vars() 28285 1727204264.04966: done getting variables 28285 1727204264.05058: in VariableManager get_vars() 28285 1727204264.05163: Calling all_inventory to load vars for managed-node1 28285 1727204264.05167: Calling groups_inventory to load vars for managed-node1 28285 1727204264.05170: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204264.05174: Calling all_plugins_play to load vars for managed-node1 28285 1727204264.05176: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204264.05179: Calling groups_plugins_play to load vars for managed-node1 28285 1727204264.05317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204264.05508: done with get_vars() 28285 1727204264.05521: done queuing things up, now waiting for results queue to drain 28285 1727204264.05522: results queue empty 28285 1727204264.05523: checking for any_errors_fatal 28285 1727204264.05526: done checking for any_errors_fatal 28285 1727204264.05527: checking for max_fail_percentage 28285 1727204264.05528: done checking for max_fail_percentage 28285 1727204264.05528: checking to see if all hosts have failed and the running result is not ok 28285 1727204264.05529: done checking to see if all hosts have failed 28285 1727204264.05530: getting the remaining hosts for this loop 28285 1727204264.05531: done getting the remaining hosts for this loop 28285 1727204264.05533: getting the next task for host managed-node1 28285 1727204264.05537: done getting next task for host managed-node1 28285 
1727204264.05539: ^ task is: TASK: Show playbook name 28285 1727204264.05540: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204264.05542: getting variables 28285 1727204264.05543: in VariableManager get_vars() 28285 1727204264.05560: Calling all_inventory to load vars for managed-node1 28285 1727204264.05562: Calling groups_inventory to load vars for managed-node1 28285 1727204264.05566: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204264.05570: Calling all_plugins_play to load vars for managed-node1 28285 1727204264.05581: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204264.05584: Calling groups_plugins_play to load vars for managed-node1 28285 1727204264.05726: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204264.06083: done with get_vars() 28285 1727204264.06091: done getting variables 28285 1727204264.06353: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Show playbook name] ****************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethtool_features.yml:9 Tuesday 24 September 2024 14:57:44 -0400 (0:00:00.093) 0:00:04.873 ***** 28285 1727204264.06384: entering _queue_task() for managed-node1/debug 28285 1727204264.06386: Creating lock for debug 28285 1727204264.06862: worker is 1 (out of 1 available) 28285 1727204264.06880: exiting _queue_task() for managed-node1/debug 28285 1727204264.06892: done queuing things up, now waiting for results queue to drain 28285 1727204264.06894: waiting for pending results... 
28285 1727204264.07146: running TaskExecutor() for managed-node1/TASK: Show playbook name 28285 1727204264.07249: in run() - task 0affcd87-79f5-57a1-d976-00000000000b 28285 1727204264.07272: variable 'ansible_search_path' from source: unknown 28285 1727204264.07319: calling self._execute() 28285 1727204264.07408: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204264.07424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204264.07441: variable 'omit' from source: magic vars 28285 1727204264.07900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204264.10328: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204264.10422: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204264.10482: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204264.10549: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204264.10613: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204264.10771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204264.10809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204264.10842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204264.10889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204264.10916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204264.11081: variable 'ansible_distribution' from source: facts 28285 1727204264.11092: variable 'ansible_distribution_major_version' from source: facts 28285 1727204264.11113: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204264.11130: when evaluation is False, skipping this task 28285 1727204264.11140: _execute() done 28285 1727204264.11148: dumping result to json 28285 1727204264.11155: done dumping result, returning 28285 1727204264.11171: done running TaskExecutor() for managed-node1/TASK: Show playbook name [0affcd87-79f5-57a1-d976-00000000000b] 28285 1727204264.11181: sending task result for task 0affcd87-79f5-57a1-d976-00000000000b skipping: [managed-node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 28285 1727204264.11332: no more pending results, returning what we have 28285 1727204264.11336: results queue empty 28285 1727204264.11337: checking for any_errors_fatal 28285 
1727204264.11338: done checking for any_errors_fatal 28285 1727204264.11339: checking for max_fail_percentage 28285 1727204264.11341: done checking for max_fail_percentage 28285 1727204264.11342: checking to see if all hosts have failed and the running result is not ok 28285 1727204264.11343: done checking to see if all hosts have failed 28285 1727204264.11343: getting the remaining hosts for this loop 28285 1727204264.11345: done getting the remaining hosts for this loop 28285 1727204264.11350: getting the next task for host managed-node1 28285 1727204264.11358: done getting next task for host managed-node1 28285 1727204264.11361: ^ task is: TASK: INIT: Ethtool feeatures tests 28285 1727204264.11365: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204264.11369: getting variables 28285 1727204264.11371: in VariableManager get_vars() 28285 1727204264.11433: Calling all_inventory to load vars for managed-node1 28285 1727204264.11436: Calling groups_inventory to load vars for managed-node1 28285 1727204264.11439: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204264.11450: Calling all_plugins_play to load vars for managed-node1 28285 1727204264.11452: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204264.11455: Calling groups_plugins_play to load vars for managed-node1 28285 1727204264.11645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204264.11854: done with get_vars() 28285 1727204264.11869: done getting variables 28285 1727204264.11935: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 28285 1727204264.12109: done sending task result for task 0affcd87-79f5-57a1-d976-00000000000b 28285 1727204264.12112: WORKER PROCESS EXITING TASK [INIT: Ethtool feeatures tests] ******************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethtool_features.yml:15 Tuesday 24 September 2024 14:57:44 -0400 (0:00:00.057) 0:00:04.931 ***** 28285 1727204264.12129: entering _queue_task() for managed-node1/debug 28285 1727204264.12798: worker is 1 (out of 1 available) 28285 1727204264.12809: exiting _queue_task() for managed-node1/debug 28285 1727204264.12820: done queuing things up, now waiting for results queue to drain 28285 1727204264.12822: waiting for pending results... 
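'Show playbook name' above and 'INIT: Ethtool feeatures tests' queued here both resolve to the debug action module and are skipped under the same guard. Their bodies are not visible in the trace; the sketch below only assumes a shape, and the message text is invented.

    # Assumed shape only; the message text is invented. The EL<9 guard that the
    # trace shows being evaluated may sit on the task itself or be inherited
    # from a surrounding block or play.
    - name: Show playbook name
      ansible.builtin.debug:
        msg: "this is: tests_ethtool_features.yml"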
28285 1727204264.13902: running TaskExecutor() for managed-node1/TASK: INIT: Ethtool feeatures tests 28285 1727204264.14031: in run() - task 0affcd87-79f5-57a1-d976-00000000000c 28285 1727204264.14055: variable 'ansible_search_path' from source: unknown 28285 1727204264.14108: calling self._execute() 28285 1727204264.14213: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204264.14224: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204264.14243: variable 'omit' from source: magic vars 28285 1727204264.14690: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204264.19168: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204264.19268: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204264.19409: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204264.19562: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204264.19598: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204264.19796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204264.19828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204264.19865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204264.20005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204264.20024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204264.20305: variable 'ansible_distribution' from source: facts 28285 1727204264.20406: variable 'ansible_distribution_major_version' from source: facts 28285 1727204264.20430: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204264.20438: when evaluation is False, skipping this task 28285 1727204264.20445: _execute() done 28285 1727204264.20451: dumping result to json 28285 1727204264.20457: done dumping result, returning 28285 1727204264.20471: done running TaskExecutor() for managed-node1/TASK: INIT: Ethtool feeatures tests [0affcd87-79f5-57a1-d976-00000000000c] 28285 1727204264.20480: sending task result for task 0affcd87-79f5-57a1-d976-00000000000c skipping: [managed-node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 28285 1727204264.20652: no more pending results, returning what we have 28285 1727204264.20656: results queue empty 28285 1727204264.20657: checking for 
any_errors_fatal 28285 1727204264.20662: done checking for any_errors_fatal 28285 1727204264.20663: checking for max_fail_percentage 28285 1727204264.20666: done checking for max_fail_percentage 28285 1727204264.20667: checking to see if all hosts have failed and the running result is not ok 28285 1727204264.20668: done checking to see if all hosts have failed 28285 1727204264.20669: getting the remaining hosts for this loop 28285 1727204264.20671: done getting the remaining hosts for this loop 28285 1727204264.20675: getting the next task for host managed-node1 28285 1727204264.20682: done getting next task for host managed-node1 28285 1727204264.20685: ^ task is: TASK: Include the task 'show_interfaces.yml' 28285 1727204264.20687: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204264.20690: getting variables 28285 1727204264.20692: in VariableManager get_vars() 28285 1727204264.20747: Calling all_inventory to load vars for managed-node1 28285 1727204264.20749: Calling groups_inventory to load vars for managed-node1 28285 1727204264.20752: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204264.20765: Calling all_plugins_play to load vars for managed-node1 28285 1727204264.20768: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204264.20771: Calling groups_plugins_play to load vars for managed-node1 28285 1727204264.20986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204264.21194: done with get_vars() 28285 1727204264.21205: done getting variables 28285 1727204264.21785: done sending task result for task 0affcd87-79f5-57a1-d976-00000000000c 28285 1727204264.21788: WORKER PROCESS EXITING TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethtool_features.yml:18 Tuesday 24 September 2024 14:57:44 -0400 (0:00:00.096) 0:00:05.028 ***** 28285 1727204264.21826: entering _queue_task() for managed-node1/include_tasks 28285 1727204264.22179: worker is 1 (out of 1 available) 28285 1727204264.22191: exiting _queue_task() for managed-node1/include_tasks 28285 1727204264.22203: done queuing things up, now waiting for results queue to drain 28285 1727204264.22205: waiting for pending results... 
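The rest of the play consists of include_tasks wrappers: 'show_interfaces.yml' above, then 'manage_test_interface.yml' and 'assert_device_present.yml' below, each skipped under the same guard before anything is included. An assumed sketch of that pattern follows; only the task names and the included file names come from the trace.

    # Assumed shape; file names are taken from the task names in the trace,
    # the "tasks/" path prefix is a guess, and the EL<9 guard shown earlier is
    # omitted here for brevity.
    - name: Include the task 'show_interfaces.yml'
      ansible.builtin.include_tasks: tasks/show_interfaces.yml

    - name: Include the task 'manage_test_interface.yml'
      ansible.builtin.include_tasks: tasks/manage_test_interface.yml

    - name: Include the task 'assert_device_present.yml'
      ansible.builtin.include_tasks: tasks/assert_device_present.yml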
28285 1727204264.22754: running TaskExecutor() for managed-node1/TASK: Include the task 'show_interfaces.yml' 28285 1727204264.22856: in run() - task 0affcd87-79f5-57a1-d976-00000000000d 28285 1727204264.22877: variable 'ansible_search_path' from source: unknown 28285 1727204264.22918: calling self._execute() 28285 1727204264.23028: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204264.23046: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204264.23063: variable 'omit' from source: magic vars 28285 1727204264.23561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204264.26899: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204264.27009: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204264.27201: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204264.27242: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204264.27293: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204264.27468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204264.27544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204264.27721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204264.27771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204264.27803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204264.28263: variable 'ansible_distribution' from source: facts 28285 1727204264.28284: variable 'ansible_distribution_major_version' from source: facts 28285 1727204264.28316: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204264.28324: when evaluation is False, skipping this task 28285 1727204264.28346: _execute() done 28285 1727204264.28372: dumping result to json 28285 1727204264.28383: done dumping result, returning 28285 1727204264.28415: done running TaskExecutor() for managed-node1/TASK: Include the task 'show_interfaces.yml' [0affcd87-79f5-57a1-d976-00000000000d] 28285 1727204264.28442: sending task result for task 0affcd87-79f5-57a1-d976-00000000000d skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204264.28716: no more pending results, returning what we have 28285 
1727204264.28719: results queue empty 28285 1727204264.28720: checking for any_errors_fatal 28285 1727204264.28725: done checking for any_errors_fatal 28285 1727204264.28726: checking for max_fail_percentage 28285 1727204264.28727: done checking for max_fail_percentage 28285 1727204264.28728: checking to see if all hosts have failed and the running result is not ok 28285 1727204264.28729: done checking to see if all hosts have failed 28285 1727204264.28729: getting the remaining hosts for this loop 28285 1727204264.28731: done getting the remaining hosts for this loop 28285 1727204264.28735: getting the next task for host managed-node1 28285 1727204264.28742: done getting next task for host managed-node1 28285 1727204264.28745: ^ task is: TASK: Include the task 'manage_test_interface.yml' 28285 1727204264.28747: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204264.28750: getting variables 28285 1727204264.28752: in VariableManager get_vars() 28285 1727204264.28808: Calling all_inventory to load vars for managed-node1 28285 1727204264.28812: Calling groups_inventory to load vars for managed-node1 28285 1727204264.28815: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204264.28829: Calling all_plugins_play to load vars for managed-node1 28285 1727204264.28832: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204264.28835: Calling groups_plugins_play to load vars for managed-node1 28285 1727204264.28989: done sending task result for task 0affcd87-79f5-57a1-d976-00000000000d 28285 1727204264.28992: WORKER PROCESS EXITING 28285 1727204264.29015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204264.29215: done with get_vars() 28285 1727204264.29227: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethtool_features.yml:20 Tuesday 24 September 2024 14:57:44 -0400 (0:00:00.074) 0:00:05.103 ***** 28285 1727204264.29320: entering _queue_task() for managed-node1/include_tasks 28285 1727204264.29930: worker is 1 (out of 1 available) 28285 1727204264.29941: exiting _queue_task() for managed-node1/include_tasks 28285 1727204264.30069: done queuing things up, now waiting for results queue to drain 28285 1727204264.30071: waiting for pending results... 
28285 1727204264.31119: running TaskExecutor() for managed-node1/TASK: Include the task 'manage_test_interface.yml' 28285 1727204264.31320: in run() - task 0affcd87-79f5-57a1-d976-00000000000e 28285 1727204264.31339: variable 'ansible_search_path' from source: unknown 28285 1727204264.31385: calling self._execute() 28285 1727204264.31525: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204264.31539: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204264.31557: variable 'omit' from source: magic vars 28285 1727204264.32029: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204264.37860: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204264.38060: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204264.38105: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204264.38154: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204264.38255: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204264.38483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204264.38571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204264.38659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204264.38861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204264.38946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204264.39216: variable 'ansible_distribution' from source: facts 28285 1727204264.39376: variable 'ansible_distribution_major_version' from source: facts 28285 1727204264.39401: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204264.39409: when evaluation is False, skipping this task 28285 1727204264.39416: _execute() done 28285 1727204264.39424: dumping result to json 28285 1727204264.39431: done dumping result, returning 28285 1727204264.39443: done running TaskExecutor() for managed-node1/TASK: Include the task 'manage_test_interface.yml' [0affcd87-79f5-57a1-d976-00000000000e] 28285 1727204264.39457: sending task result for task 0affcd87-79f5-57a1-d976-00000000000e 28285 1727204264.39588: done sending task result for task 0affcd87-79f5-57a1-d976-00000000000e skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": 
"Conditional result was False" } 28285 1727204264.39640: no more pending results, returning what we have 28285 1727204264.39644: results queue empty 28285 1727204264.39645: checking for any_errors_fatal 28285 1727204264.39653: done checking for any_errors_fatal 28285 1727204264.39654: checking for max_fail_percentage 28285 1727204264.39656: done checking for max_fail_percentage 28285 1727204264.39657: checking to see if all hosts have failed and the running result is not ok 28285 1727204264.39658: done checking to see if all hosts have failed 28285 1727204264.39659: getting the remaining hosts for this loop 28285 1727204264.39661: done getting the remaining hosts for this loop 28285 1727204264.39667: getting the next task for host managed-node1 28285 1727204264.39674: done getting next task for host managed-node1 28285 1727204264.39676: ^ task is: TASK: Include the task 'assert_device_present.yml' 28285 1727204264.39679: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204264.39682: getting variables 28285 1727204264.39685: in VariableManager get_vars() 28285 1727204264.39743: Calling all_inventory to load vars for managed-node1 28285 1727204264.39746: Calling groups_inventory to load vars for managed-node1 28285 1727204264.39751: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204264.39767: Calling all_plugins_play to load vars for managed-node1 28285 1727204264.39770: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204264.39773: Calling groups_plugins_play to load vars for managed-node1 28285 1727204264.40010: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204264.40219: done with get_vars() 28285 1727204264.40230: done getting variables 28285 1727204264.40529: WORKER PROCESS EXITING TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethtool_features.yml:24 Tuesday 24 September 2024 14:57:44 -0400 (0:00:00.112) 0:00:05.215 ***** 28285 1727204264.40566: entering _queue_task() for managed-node1/include_tasks 28285 1727204264.41305: worker is 1 (out of 1 available) 28285 1727204264.41341: exiting _queue_task() for managed-node1/include_tasks 28285 1727204264.41356: done queuing things up, now waiting for results queue to drain 28285 1727204264.41358: waiting for pending results... 
28285 1727204264.42215: running TaskExecutor() for managed-node1/TASK: Include the task 'assert_device_present.yml' 28285 1727204264.42480: in run() - task 0affcd87-79f5-57a1-d976-00000000000f 28285 1727204264.42507: variable 'ansible_search_path' from source: unknown 28285 1727204264.42554: calling self._execute() 28285 1727204264.42806: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204264.42825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204264.42841: variable 'omit' from source: magic vars 28285 1727204264.44865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204264.50781: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204264.51634: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204264.51688: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204264.51879: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204264.51972: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204264.52174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204264.52207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204264.52237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204264.52313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204264.52552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204264.52744: variable 'ansible_distribution' from source: facts 28285 1727204264.52878: variable 'ansible_distribution_major_version' from source: facts 28285 1727204264.52993: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204264.53000: when evaluation is False, skipping this task 28285 1727204264.53008: _execute() done 28285 1727204264.53014: dumping result to json 28285 1727204264.53021: done dumping result, returning 28285 1727204264.53033: done running TaskExecutor() for managed-node1/TASK: Include the task 'assert_device_present.yml' [0affcd87-79f5-57a1-d976-00000000000f] 28285 1727204264.53044: sending task result for task 0affcd87-79f5-57a1-d976-00000000000f 28285 1727204264.53166: done sending task result for task 0affcd87-79f5-57a1-d976-00000000000f skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": 
"Conditional result was False" } 28285 1727204264.53235: no more pending results, returning what we have 28285 1727204264.53239: results queue empty 28285 1727204264.53239: checking for any_errors_fatal 28285 1727204264.53244: done checking for any_errors_fatal 28285 1727204264.53245: checking for max_fail_percentage 28285 1727204264.53246: done checking for max_fail_percentage 28285 1727204264.53247: checking to see if all hosts have failed and the running result is not ok 28285 1727204264.53251: done checking to see if all hosts have failed 28285 1727204264.53252: getting the remaining hosts for this loop 28285 1727204264.53253: done getting the remaining hosts for this loop 28285 1727204264.53258: getting the next task for host managed-node1 28285 1727204264.53266: done getting next task for host managed-node1 28285 1727204264.53269: ^ task is: TASK: Install ethtool (test dependency) 28285 1727204264.53271: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204264.53275: getting variables 28285 1727204264.53277: in VariableManager get_vars() 28285 1727204264.53333: Calling all_inventory to load vars for managed-node1 28285 1727204264.53336: Calling groups_inventory to load vars for managed-node1 28285 1727204264.53339: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204264.53353: Calling all_plugins_play to load vars for managed-node1 28285 1727204264.53356: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204264.53360: Calling groups_plugins_play to load vars for managed-node1 28285 1727204264.53534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204264.53747: done with get_vars() 28285 1727204264.53761: done getting variables 28285 1727204264.53830: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install ethtool (test dependency)] *************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethtool_features.yml:26 Tuesday 24 September 2024 14:57:44 -0400 (0:00:00.132) 0:00:05.348 ***** 28285 1727204264.53869: entering _queue_task() for managed-node1/package 28285 1727204264.53889: WORKER PROCESS EXITING 28285 1727204264.55178: worker is 1 (out of 1 available) 28285 1727204264.55189: exiting _queue_task() for managed-node1/package 28285 1727204264.55424: done queuing things up, now waiting for results queue to drain 28285 1727204264.55426: waiting for pending results... 
28285 1727204264.56227: running TaskExecutor() for managed-node1/TASK: Install ethtool (test dependency) 28285 1727204264.56443: in run() - task 0affcd87-79f5-57a1-d976-000000000010 28285 1727204264.56463: variable 'ansible_search_path' from source: unknown 28285 1727204264.56521: calling self._execute() 28285 1727204264.56726: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204264.56742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204264.56759: variable 'omit' from source: magic vars 28285 1727204264.57834: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204264.65371: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204264.65604: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204264.65669: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204264.65791: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204264.65821: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204264.65942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204264.66106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204264.66138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204264.66229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204264.66312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204264.66697: variable 'ansible_distribution' from source: facts 28285 1727204264.66859: variable 'ansible_distribution_major_version' from source: facts 28285 1727204264.66886: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204264.66895: when evaluation is False, skipping this task 28285 1727204264.66902: _execute() done 28285 1727204264.66908: dumping result to json 28285 1727204264.66915: done dumping result, returning 28285 1727204264.66926: done running TaskExecutor() for managed-node1/TASK: Install ethtool (test dependency) [0affcd87-79f5-57a1-d976-000000000010] 28285 1727204264.66936: sending task result for task 0affcd87-79f5-57a1-d976-000000000010 skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204264.67110: no more pending results, returning what we have 28285 
1727204264.67114: results queue empty 28285 1727204264.67115: checking for any_errors_fatal 28285 1727204264.67121: done checking for any_errors_fatal 28285 1727204264.67122: checking for max_fail_percentage 28285 1727204264.67124: done checking for max_fail_percentage 28285 1727204264.67125: checking to see if all hosts have failed and the running result is not ok 28285 1727204264.67125: done checking to see if all hosts have failed 28285 1727204264.67126: getting the remaining hosts for this loop 28285 1727204264.67128: done getting the remaining hosts for this loop 28285 1727204264.67132: getting the next task for host managed-node1 28285 1727204264.67140: done getting next task for host managed-node1 28285 1727204264.67143: ^ task is: TASK: TEST: I can create a profile without changing the ethtool features. 28285 1727204264.67145: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204264.67151: getting variables 28285 1727204264.67152: in VariableManager get_vars() 28285 1727204264.67211: Calling all_inventory to load vars for managed-node1 28285 1727204264.67215: Calling groups_inventory to load vars for managed-node1 28285 1727204264.67218: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204264.67231: Calling all_plugins_play to load vars for managed-node1 28285 1727204264.67234: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204264.67237: Calling groups_plugins_play to load vars for managed-node1 28285 1727204264.67491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204264.67927: done with get_vars() 28285 1727204264.67941: done getting variables 28285 1727204264.67979: done sending task result for task 0affcd87-79f5-57a1-d976-000000000010 28285 1727204264.67983: WORKER PROCESS EXITING 28285 1727204264.68245: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [TEST: I can create a profile without changing the ethtool features.] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethtool_features.yml:34 Tuesday 24 September 2024 14:57:44 -0400 (0:00:00.144) 0:00:05.492 ***** 28285 1727204264.68280: entering _queue_task() for managed-node1/debug 28285 1727204264.68959: worker is 1 (out of 1 available) 28285 1727204264.69080: exiting _queue_task() for managed-node1/debug 28285 1727204264.69098: done queuing things up, now waiting for results queue to drain 28285 1727204264.69100: waiting for pending results... 28285 1727204264.69656: running TaskExecutor() for managed-node1/TASK: TEST: I can create a profile without changing the ethtool features. 
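
The TEST banner task at tests_ethtool_features.yml:34 is a debug action (the debug ActionModule is loaded just before the header above). A plausible sketch, with the message text assumed:

    - name: "TEST: I can create a profile without changing the ethtool features."
      ansible.builtin.debug:
        msg: "Running test: profile creation without ethtool feature changes"   # assumed text; not shown in the log
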
28285 1727204264.69768: in run() - task 0affcd87-79f5-57a1-d976-000000000012 28285 1727204264.69789: variable 'ansible_search_path' from source: unknown 28285 1727204264.69830: calling self._execute() 28285 1727204264.69931: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204264.69943: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204264.69959: variable 'omit' from source: magic vars 28285 1727204264.70582: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204264.73659: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204264.73741: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204264.73786: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204264.73846: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204264.73881: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204264.73970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204264.74005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204264.74039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204264.74095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204264.74116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204264.74272: variable 'ansible_distribution' from source: facts 28285 1727204264.74284: variable 'ansible_distribution_major_version' from source: facts 28285 1727204264.74305: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204264.74313: when evaluation is False, skipping this task 28285 1727204264.74319: _execute() done 28285 1727204264.74324: dumping result to json 28285 1727204264.74330: done dumping result, returning 28285 1727204264.74341: done running TaskExecutor() for managed-node1/TASK: TEST: I can create a profile without changing the ethtool features. 
[0affcd87-79f5-57a1-d976-000000000012] 28285 1727204264.74350: sending task result for task 0affcd87-79f5-57a1-d976-000000000012 28285 1727204264.74461: done sending task result for task 0affcd87-79f5-57a1-d976-000000000012 28285 1727204264.74469: WORKER PROCESS EXITING skipping: [managed-node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 28285 1727204264.74522: no more pending results, returning what we have 28285 1727204264.74526: results queue empty 28285 1727204264.74527: checking for any_errors_fatal 28285 1727204264.74533: done checking for any_errors_fatal 28285 1727204264.74534: checking for max_fail_percentage 28285 1727204264.74536: done checking for max_fail_percentage 28285 1727204264.74536: checking to see if all hosts have failed and the running result is not ok 28285 1727204264.74537: done checking to see if all hosts have failed 28285 1727204264.74538: getting the remaining hosts for this loop 28285 1727204264.74540: done getting the remaining hosts for this loop 28285 1727204264.74544: getting the next task for host managed-node1 28285 1727204264.74550: done getting next task for host managed-node1 28285 1727204264.74553: ^ task is: TASK: Get current device features 28285 1727204264.74555: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204264.74558: getting variables 28285 1727204264.74560: in VariableManager get_vars() 28285 1727204264.74618: Calling all_inventory to load vars for managed-node1 28285 1727204264.74621: Calling groups_inventory to load vars for managed-node1 28285 1727204264.74624: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204264.74635: Calling all_plugins_play to load vars for managed-node1 28285 1727204264.74638: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204264.74641: Calling groups_plugins_play to load vars for managed-node1 28285 1727204264.74816: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204264.75025: done with get_vars() 28285 1727204264.75037: done getting variables 28285 1727204264.75107: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get current device features] ********************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethtool_features.yml:38 Tuesday 24 September 2024 14:57:44 -0400 (0:00:00.068) 0:00:05.561 ***** 28285 1727204264.75142: entering _queue_task() for managed-node1/command 28285 1727204264.75699: worker is 1 (out of 1 available) 28285 1727204264.75713: exiting _queue_task() for managed-node1/command 28285 1727204264.75731: done queuing things up, now waiting for results queue to drain 28285 1727204264.75733: waiting for pending results... 
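
'Get current device features' at tests_ethtool_features.yml:38 resolves to the command action plugin. Given the test's focus, it presumably shells out to ethtool -k against the test interface and registers the output; the exact command line, variable name, and register are assumptions:

    - name: Get current device features
      ansible.builtin.command: "ethtool -k {{ interface }}"   # command and variable name are assumptions
      register: original_dev_features                         # register name is an assumption
      changed_when: false                                     # assumed read-only convention
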
28285 1727204264.76099: running TaskExecutor() for managed-node1/TASK: Get current device features 28285 1727204264.76231: in run() - task 0affcd87-79f5-57a1-d976-000000000013 28285 1727204264.76252: variable 'ansible_search_path' from source: unknown 28285 1727204264.76305: calling self._execute() 28285 1727204264.76402: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204264.76413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204264.76428: variable 'omit' from source: magic vars 28285 1727204264.76954: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204264.79892: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204264.79974: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204264.80030: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204264.80072: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204264.80107: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204264.80189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204264.80230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204264.80267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204264.80318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204264.80342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204264.80486: variable 'ansible_distribution' from source: facts 28285 1727204264.80496: variable 'ansible_distribution_major_version' from source: facts 28285 1727204264.80517: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204264.80523: when evaluation is False, skipping this task 28285 1727204264.80528: _execute() done 28285 1727204264.80533: dumping result to json 28285 1727204264.80550: done dumping result, returning 28285 1727204264.80566: done running TaskExecutor() for managed-node1/TASK: Get current device features [0affcd87-79f5-57a1-d976-000000000013] 28285 1727204264.80577: sending task result for task 0affcd87-79f5-57a1-d976-000000000013 skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204264.80721: no more pending results, returning what we have 28285 1727204264.80726: results 
queue empty 28285 1727204264.80727: checking for any_errors_fatal 28285 1727204264.80731: done checking for any_errors_fatal 28285 1727204264.80732: checking for max_fail_percentage 28285 1727204264.80734: done checking for max_fail_percentage 28285 1727204264.80735: checking to see if all hosts have failed and the running result is not ok 28285 1727204264.80736: done checking to see if all hosts have failed 28285 1727204264.80736: getting the remaining hosts for this loop 28285 1727204264.80738: done getting the remaining hosts for this loop 28285 1727204264.80742: getting the next task for host managed-node1 28285 1727204264.80750: done getting next task for host managed-node1 28285 1727204264.80756: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28285 1727204264.80759: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204264.80777: getting variables 28285 1727204264.80780: in VariableManager get_vars() 28285 1727204264.80835: Calling all_inventory to load vars for managed-node1 28285 1727204264.80838: Calling groups_inventory to load vars for managed-node1 28285 1727204264.80841: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204264.80851: Calling all_plugins_play to load vars for managed-node1 28285 1727204264.80853: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204264.80856: Calling groups_plugins_play to load vars for managed-node1 28285 1727204264.81030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204264.81312: done with get_vars() 28285 1727204264.81323: done getting variables 28285 1727204264.81515: done sending task result for task 0affcd87-79f5-57a1-d976-000000000013 28285 1727204264.81518: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:57:44 -0400 (0:00:00.064) 0:00:05.625 ***** 28285 1727204264.81580: entering _queue_task() for managed-node1/include_tasks 28285 1727204264.82010: worker is 1 (out of 1 available) 28285 1727204264.82021: exiting _queue_task() for managed-node1/include_tasks 28285 1727204264.82033: done queuing things up, now waiting for results queue to drain 28285 1727204264.82035: waiting for pending results... 
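
From this point the executor is inside the fedora.linux_system_roles.network role: note the role prefix in the task name and the nested "tasks child state" in the host state above. The test playbook presumably hands control to the role roughly like this; whether it uses a play-level roles: entry or include_role, and which variables it passes, is not visible in the log:

    - name: Configure networking via the role under test     # illustrative task name
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.network
      vars:
        network_connections: []                               # placeholder; the actual test profile is not shown here
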
28285 1727204264.82289: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28285 1727204264.82424: in run() - task 0affcd87-79f5-57a1-d976-00000000001b 28285 1727204264.82443: variable 'ansible_search_path' from source: unknown 28285 1727204264.82449: variable 'ansible_search_path' from source: unknown 28285 1727204264.82497: calling self._execute() 28285 1727204264.82576: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204264.82593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204264.82607: variable 'omit' from source: magic vars 28285 1727204264.83046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204264.85983: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204264.86049: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204264.86112: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204264.86150: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204264.86189: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204264.86269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204264.86311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204264.86343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204264.86396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204264.86417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204264.86567: variable 'ansible_distribution' from source: facts 28285 1727204264.86579: variable 'ansible_distribution_major_version' from source: facts 28285 1727204264.86601: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204264.86612: when evaluation is False, skipping this task 28285 1727204264.86624: _execute() done 28285 1727204264.86630: dumping result to json 28285 1727204264.86636: done dumping result, returning 28285 1727204264.86647: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-57a1-d976-00000000001b] 28285 1727204264.86657: sending task result for task 0affcd87-79f5-57a1-d976-00000000001b skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int 
< 9)", "skip_reason": "Conditional result was False" } 28285 1727204264.86803: no more pending results, returning what we have 28285 1727204264.86806: results queue empty 28285 1727204264.86807: checking for any_errors_fatal 28285 1727204264.86812: done checking for any_errors_fatal 28285 1727204264.86813: checking for max_fail_percentage 28285 1727204264.86815: done checking for max_fail_percentage 28285 1727204264.86816: checking to see if all hosts have failed and the running result is not ok 28285 1727204264.86817: done checking to see if all hosts have failed 28285 1727204264.86818: getting the remaining hosts for this loop 28285 1727204264.86819: done getting the remaining hosts for this loop 28285 1727204264.86824: getting the next task for host managed-node1 28285 1727204264.86831: done getting next task for host managed-node1 28285 1727204264.86836: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 28285 1727204264.86839: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204264.86853: getting variables 28285 1727204264.86855: in VariableManager get_vars() 28285 1727204264.86913: Calling all_inventory to load vars for managed-node1 28285 1727204264.86917: Calling groups_inventory to load vars for managed-node1 28285 1727204264.86919: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204264.86930: Calling all_plugins_play to load vars for managed-node1 28285 1727204264.86932: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204264.86935: Calling groups_plugins_play to load vars for managed-node1 28285 1727204264.87123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204264.87355: done with get_vars() 28285 1727204264.87369: done getting variables 28285 1727204264.87515: done sending task result for task 0affcd87-79f5-57a1-d976-00000000001b 28285 1727204264.87518: WORKER PROCESS EXITING 28285 1727204264.87555: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:57:44 -0400 (0:00:00.060) 0:00:05.685 ***** 28285 1727204264.87596: entering _queue_task() for managed-node1/debug 28285 1727204264.88053: worker is 1 (out of 1 available) 28285 1727204264.88066: exiting _queue_task() for managed-node1/debug 28285 1727204264.88077: done queuing things up, now waiting for results queue to drain 28285 1727204264.88079: waiting for pending results... 
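
The role's 'Print network provider' task at main.yml:7 is another debug action (loaded just above). It presumably reports which backend the role selected; the variable name is an assumption based on the task title:

    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"   # variable name assumed from the task title
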
28285 1727204264.88328: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider 28285 1727204264.88470: in run() - task 0affcd87-79f5-57a1-d976-00000000001c 28285 1727204264.88493: variable 'ansible_search_path' from source: unknown 28285 1727204264.88501: variable 'ansible_search_path' from source: unknown 28285 1727204264.88546: calling self._execute() 28285 1727204264.88634: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204264.88646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204264.88661: variable 'omit' from source: magic vars 28285 1727204264.89081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204264.92171: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204264.92251: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204264.92295: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204264.92336: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204264.92376: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204264.92465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204264.92499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204264.92532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204264.92581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204264.92602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204264.92749: variable 'ansible_distribution' from source: facts 28285 1727204264.92758: variable 'ansible_distribution_major_version' from source: facts 28285 1727204264.92788: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204264.92796: when evaluation is False, skipping this task 28285 1727204264.92803: _execute() done 28285 1727204264.92809: dumping result to json 28285 1727204264.92817: done dumping result, returning 28285 1727204264.92831: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-57a1-d976-00000000001c] 28285 1727204264.92843: sending task result for task 0affcd87-79f5-57a1-d976-00000000001c 28285 1727204264.92957: done sending task result for task 0affcd87-79f5-57a1-d976-00000000001c skipping: [managed-node1] => { "false_condition": "(ansible_distribution in 
['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 28285 1727204264.93006: no more pending results, returning what we have 28285 1727204264.93009: results queue empty 28285 1727204264.93010: checking for any_errors_fatal 28285 1727204264.93014: done checking for any_errors_fatal 28285 1727204264.93015: checking for max_fail_percentage 28285 1727204264.93017: done checking for max_fail_percentage 28285 1727204264.93017: checking to see if all hosts have failed and the running result is not ok 28285 1727204264.93018: done checking to see if all hosts have failed 28285 1727204264.93019: getting the remaining hosts for this loop 28285 1727204264.93021: done getting the remaining hosts for this loop 28285 1727204264.93025: getting the next task for host managed-node1 28285 1727204264.93031: done getting next task for host managed-node1 28285 1727204264.93035: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28285 1727204264.93037: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204264.93052: getting variables 28285 1727204264.93054: in VariableManager get_vars() 28285 1727204264.93112: Calling all_inventory to load vars for managed-node1 28285 1727204264.93115: Calling groups_inventory to load vars for managed-node1 28285 1727204264.93117: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204264.93127: Calling all_plugins_play to load vars for managed-node1 28285 1727204264.93129: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204264.93132: Calling groups_plugins_play to load vars for managed-node1 28285 1727204264.93489: WORKER PROCESS EXITING 28285 1727204264.93515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204264.93726: done with get_vars() 28285 1727204264.93736: done getting variables 28285 1727204264.93819: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:57:44 -0400 (0:00:00.062) 0:00:05.748 ***** 28285 1727204264.93853: entering _queue_task() for managed-node1/fail 28285 1727204264.93855: Creating lock for fail 28285 1727204264.94121: worker is 1 (out of 1 available) 28285 1727204264.94133: exiting _queue_task() for managed-node1/fail 28285 1727204264.94144: done queuing things up, now waiting for results queue to drain 28285 1727204264.94149: waiting for 
pending results... 28285 1727204264.94411: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28285 1727204264.94559: in run() - task 0affcd87-79f5-57a1-d976-00000000001d 28285 1727204264.94579: variable 'ansible_search_path' from source: unknown 28285 1727204264.94605: variable 'ansible_search_path' from source: unknown 28285 1727204264.94646: calling self._execute() 28285 1727204264.94773: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204264.94785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204264.94800: variable 'omit' from source: magic vars 28285 1727204264.95375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204264.98071: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204264.98159: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204264.98211: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204264.98249: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204264.98282: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204264.98369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204264.98404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204264.98443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204264.98491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204264.98512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204264.98666: variable 'ansible_distribution' from source: facts 28285 1727204264.98679: variable 'ansible_distribution_major_version' from source: facts 28285 1727204264.98700: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204264.98707: when evaluation is False, skipping this task 28285 1727204264.98714: _execute() done 28285 1727204264.98720: dumping result to json 28285 1727204264.98726: done dumping result, returning 28285 1727204264.98744: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-57a1-d976-00000000001d] 28285 1727204264.98756: sending task result for task 
0affcd87-79f5-57a1-d976-00000000001d skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204264.98911: no more pending results, returning what we have 28285 1727204264.98914: results queue empty 28285 1727204264.98915: checking for any_errors_fatal 28285 1727204264.98920: done checking for any_errors_fatal 28285 1727204264.98921: checking for max_fail_percentage 28285 1727204264.98923: done checking for max_fail_percentage 28285 1727204264.98923: checking to see if all hosts have failed and the running result is not ok 28285 1727204264.98924: done checking to see if all hosts have failed 28285 1727204264.98925: getting the remaining hosts for this loop 28285 1727204264.98927: done getting the remaining hosts for this loop 28285 1727204264.98931: getting the next task for host managed-node1 28285 1727204264.98938: done getting next task for host managed-node1 28285 1727204264.98942: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28285 1727204264.98945: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28285 1727204264.98959: getting variables 28285 1727204264.98960: in VariableManager get_vars() 28285 1727204264.99019: Calling all_inventory to load vars for managed-node1 28285 1727204264.99023: Calling groups_inventory to load vars for managed-node1 28285 1727204264.99025: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204264.99036: Calling all_plugins_play to load vars for managed-node1 28285 1727204264.99038: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204264.99041: Calling groups_plugins_play to load vars for managed-node1 28285 1727204264.99217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204264.99457: done with get_vars() 28285 1727204264.99470: done getting variables 28285 1727204264.99600: done sending task result for task 0affcd87-79f5-57a1-d976-00000000001d 28285 1727204264.99603: WORKER PROCESS EXITING 28285 1727204264.99641: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:57:44 -0400 (0:00:00.058) 0:00:05.806 ***** 28285 1727204264.99683: entering _queue_task() for managed-node1/fail 28285 1727204265.00154: worker is 1 (out of 1 available) 28285 1727204265.00167: exiting _queue_task() for managed-node1/fail 28285 1727204265.00180: done queuing things up, now waiting for results queue to drain 28285 1727204265.00182: waiting for pending results... 
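
The 'Abort applying the network state configuration if the system version of the managed host is below 8' guard queued above uses the fail action plugin. Its own condition is never evaluated in this run because the task is skipped on the inherited distribution check, so the when clause below is only a guess at its intent:

    - name: >-
        Abort applying the network state configuration if the system version
        of the managed host is below 8
      ansible.builtin.fail:
        msg: Applying network_state is not supported on this system version   # assumed message
      when:
        - network_state is defined                                            # assumed condition
        - ansible_distribution_major_version | int < 8                        # assumed condition
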
28285 1727204265.00445: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28285 1727204265.00593: in run() - task 0affcd87-79f5-57a1-d976-00000000001e 28285 1727204265.00612: variable 'ansible_search_path' from source: unknown 28285 1727204265.00626: variable 'ansible_search_path' from source: unknown 28285 1727204265.00670: calling self._execute() 28285 1727204265.00755: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204265.00767: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204265.00780: variable 'omit' from source: magic vars 28285 1727204265.01291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204265.04109: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204265.04195: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204265.04247: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204265.04300: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204265.04339: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204265.04439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204265.04478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204265.04518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204265.04573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204265.04594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204265.04763: variable 'ansible_distribution' from source: facts 28285 1727204265.04777: variable 'ansible_distribution_major_version' from source: facts 28285 1727204265.04799: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204265.04806: when evaluation is False, skipping this task 28285 1727204265.04812: _execute() done 28285 1727204265.04817: dumping result to json 28285 1727204265.04823: done dumping result, returning 28285 1727204265.04840: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-57a1-d976-00000000001e] 28285 1727204265.04852: sending task result for task 0affcd87-79f5-57a1-d976-00000000001e skipping: [managed-node1] 
=> { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204265.05006: no more pending results, returning what we have 28285 1727204265.05010: results queue empty 28285 1727204265.05011: checking for any_errors_fatal 28285 1727204265.05017: done checking for any_errors_fatal 28285 1727204265.05018: checking for max_fail_percentage 28285 1727204265.05020: done checking for max_fail_percentage 28285 1727204265.05020: checking to see if all hosts have failed and the running result is not ok 28285 1727204265.05021: done checking to see if all hosts have failed 28285 1727204265.05022: getting the remaining hosts for this loop 28285 1727204265.05024: done getting the remaining hosts for this loop 28285 1727204265.05027: getting the next task for host managed-node1 28285 1727204265.05034: done getting next task for host managed-node1 28285 1727204265.05038: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28285 1727204265.05041: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204265.05055: getting variables 28285 1727204265.05056: in VariableManager get_vars() 28285 1727204265.05115: Calling all_inventory to load vars for managed-node1 28285 1727204265.05119: Calling groups_inventory to load vars for managed-node1 28285 1727204265.05121: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204265.05132: Calling all_plugins_play to load vars for managed-node1 28285 1727204265.05134: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204265.05137: Calling groups_plugins_play to load vars for managed-node1 28285 1727204265.05347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204265.05550: done with get_vars() 28285 1727204265.05562: done getting variables 28285 1727204265.05714: done sending task result for task 0affcd87-79f5-57a1-d976-00000000001e 28285 1727204265.05717: WORKER PROCESS EXITING 28285 1727204265.05757: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:57:45 -0400 (0:00:00.061) 0:00:05.867 ***** 28285 1727204265.05916: entering _queue_task() for managed-node1/fail 28285 1727204265.06263: worker is 1 (out of 1 available) 28285 1727204265.06277: exiting _queue_task() for managed-node1/fail 
28285 1727204265.06289: done queuing things up, now waiting for results queue to drain 28285 1727204265.06291: waiting for pending results... 28285 1727204265.06553: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28285 1727204265.06689: in run() - task 0affcd87-79f5-57a1-d976-00000000001f 28285 1727204265.06709: variable 'ansible_search_path' from source: unknown 28285 1727204265.06716: variable 'ansible_search_path' from source: unknown 28285 1727204265.06758: calling self._execute() 28285 1727204265.06850: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204265.06860: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204265.06877: variable 'omit' from source: magic vars 28285 1727204265.07345: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204265.09926: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204265.10025: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204265.10075: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204265.10119: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204265.10150: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204265.10235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204265.10268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204265.10312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204265.10356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204265.10376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204265.10527: variable 'ansible_distribution' from source: facts 28285 1727204265.10539: variable 'ansible_distribution_major_version' from source: facts 28285 1727204265.10561: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204265.10571: when evaluation is False, skipping this task 28285 1727204265.10577: _execute() done 28285 1727204265.10582: dumping result to json 28285 1727204265.10588: done dumping result, returning 28285 1727204265.10599: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 
[0affcd87-79f5-57a1-d976-00000000001f] 28285 1727204265.10612: sending task result for task 0affcd87-79f5-57a1-d976-00000000001f 28285 1727204265.10723: done sending task result for task 0affcd87-79f5-57a1-d976-00000000001f 28285 1727204265.10730: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204265.10777: no more pending results, returning what we have 28285 1727204265.10781: results queue empty 28285 1727204265.10782: checking for any_errors_fatal 28285 1727204265.10787: done checking for any_errors_fatal 28285 1727204265.10788: checking for max_fail_percentage 28285 1727204265.10790: done checking for max_fail_percentage 28285 1727204265.10791: checking to see if all hosts have failed and the running result is not ok 28285 1727204265.10792: done checking to see if all hosts have failed 28285 1727204265.10793: getting the remaining hosts for this loop 28285 1727204265.10795: done getting the remaining hosts for this loop 28285 1727204265.10799: getting the next task for host managed-node1 28285 1727204265.10805: done getting next task for host managed-node1 28285 1727204265.10809: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28285 1727204265.10813: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28285 1727204265.10829: getting variables 28285 1727204265.10831: in VariableManager get_vars() 28285 1727204265.10888: Calling all_inventory to load vars for managed-node1 28285 1727204265.10891: Calling groups_inventory to load vars for managed-node1 28285 1727204265.10893: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204265.10903: Calling all_plugins_play to load vars for managed-node1 28285 1727204265.10905: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204265.10908: Calling groups_plugins_play to load vars for managed-node1 28285 1727204265.11082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204265.11308: done with get_vars() 28285 1727204265.11321: done getting variables 28285 1727204265.11552: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:57:45 -0400 (0:00:00.058) 0:00:05.926 ***** 28285 1727204265.11659: entering _queue_task() for managed-node1/dnf 28285 1727204265.12167: worker is 1 (out of 1 available) 28285 1727204265.12180: exiting _queue_task() for managed-node1/dnf 28285 1727204265.12192: done queuing things up, now waiting for results queue to drain 28285 1727204265.12194: waiting for pending results... 
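The dnf action being queued maps to the task at roles/network/tasks/main.yml:36. The sketch below is illustrative only: the package list variable, check_mode, and register name are assumptions, while the task name, the dnf action, and the when expression are taken from the surrounding trace:

- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"   # hypothetical variable name
    state: latest
  check_mode: true                   # assumed, since the task only checks for updates
  register: __network_dnf_check      # hypothetical register name
  when: (ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)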
28285 1727204265.12508: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28285 1727204265.12653: in run() - task 0affcd87-79f5-57a1-d976-000000000020 28285 1727204265.12681: variable 'ansible_search_path' from source: unknown 28285 1727204265.12690: variable 'ansible_search_path' from source: unknown 28285 1727204265.12735: calling self._execute() 28285 1727204265.12842: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204265.12867: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204265.12881: variable 'omit' from source: magic vars 28285 1727204265.13433: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204265.16276: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204265.16360: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204265.16410: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204265.16456: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204265.16492: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204265.16583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204265.16619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204265.16654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204265.16774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204265.16800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204265.17025: variable 'ansible_distribution' from source: facts 28285 1727204265.17042: variable 'ansible_distribution_major_version' from source: facts 28285 1727204265.17071: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204265.17080: when evaluation is False, skipping this task 28285 1727204265.17087: _execute() done 28285 1727204265.17094: dumping result to json 28285 1727204265.17100: done dumping result, returning 28285 1727204265.17116: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-57a1-d976-000000000020] 28285 1727204265.17128: sending task result for task 
0affcd87-79f5-57a1-d976-000000000020 skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204265.17304: no more pending results, returning what we have 28285 1727204265.17308: results queue empty 28285 1727204265.17309: checking for any_errors_fatal 28285 1727204265.17316: done checking for any_errors_fatal 28285 1727204265.17317: checking for max_fail_percentage 28285 1727204265.17319: done checking for max_fail_percentage 28285 1727204265.17320: checking to see if all hosts have failed and the running result is not ok 28285 1727204265.17320: done checking to see if all hosts have failed 28285 1727204265.17321: getting the remaining hosts for this loop 28285 1727204265.17323: done getting the remaining hosts for this loop 28285 1727204265.17327: getting the next task for host managed-node1 28285 1727204265.17334: done getting next task for host managed-node1 28285 1727204265.17339: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28285 1727204265.17342: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28285 1727204265.17359: getting variables 28285 1727204265.17361: in VariableManager get_vars() 28285 1727204265.17421: Calling all_inventory to load vars for managed-node1 28285 1727204265.17424: Calling groups_inventory to load vars for managed-node1 28285 1727204265.17427: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204265.17438: Calling all_plugins_play to load vars for managed-node1 28285 1727204265.17441: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204265.17444: Calling groups_plugins_play to load vars for managed-node1 28285 1727204265.17743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204265.18073: done with get_vars() 28285 1727204265.18085: done getting variables 28285 1727204265.18238: done sending task result for task 0affcd87-79f5-57a1-d976-000000000020 28285 1727204265.18242: WORKER PROCESS EXITING redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28285 1727204265.18298: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:57:45 -0400 (0:00:00.066) 0:00:05.993 ***** 28285 1727204265.18332: entering _queue_task() for managed-node1/yum 28285 1727204265.18334: Creating lock for yum 28285 1727204265.18823: worker is 1 (out of 1 available) 28285 1727204265.18837: exiting _queue_task() for managed-node1/yum 28285 1727204265.18849: done queuing things up, now waiting for results queue to drain 28285 1727204265.18850: waiting for pending results... 
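Since the trace redirects ansible.builtin.yum to ansible.builtin.dnf on this controller, the YUM variant of the same check could plausibly look like the following, under the same assumptions as the dnf sketch above:

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:
    name: "{{ network_packages }}"   # hypothetical variable name
    state: latest
  check_mode: true
  register: __network_yum_check      # hypothetical register name
  when: (ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)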
28285 1727204265.19310: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28285 1727204265.19444: in run() - task 0affcd87-79f5-57a1-d976-000000000021 28285 1727204265.19474: variable 'ansible_search_path' from source: unknown 28285 1727204265.19483: variable 'ansible_search_path' from source: unknown 28285 1727204265.19528: calling self._execute() 28285 1727204265.19613: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204265.19623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204265.19638: variable 'omit' from source: magic vars 28285 1727204265.20288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204265.23175: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204265.23337: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204265.23388: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204265.23441: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204265.23478: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204265.23578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204265.23616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204265.23657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204265.23706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204265.23729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204265.23890: variable 'ansible_distribution' from source: facts 28285 1727204265.23904: variable 'ansible_distribution_major_version' from source: facts 28285 1727204265.23932: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204265.23940: when evaluation is False, skipping this task 28285 1727204265.23951: _execute() done 28285 1727204265.23962: dumping result to json 28285 1727204265.23973: done dumping result, returning 28285 1727204265.23986: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-57a1-d976-000000000021] 28285 1727204265.23998: sending task result for task 
0affcd87-79f5-57a1-d976-000000000021 skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204265.24168: no more pending results, returning what we have 28285 1727204265.24172: results queue empty 28285 1727204265.24173: checking for any_errors_fatal 28285 1727204265.24181: done checking for any_errors_fatal 28285 1727204265.24182: checking for max_fail_percentage 28285 1727204265.24183: done checking for max_fail_percentage 28285 1727204265.24184: checking to see if all hosts have failed and the running result is not ok 28285 1727204265.24185: done checking to see if all hosts have failed 28285 1727204265.24186: getting the remaining hosts for this loop 28285 1727204265.24188: done getting the remaining hosts for this loop 28285 1727204265.24192: getting the next task for host managed-node1 28285 1727204265.24201: done getting next task for host managed-node1 28285 1727204265.24206: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28285 1727204265.24209: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28285 1727204265.24222: getting variables 28285 1727204265.24224: in VariableManager get_vars() 28285 1727204265.24285: Calling all_inventory to load vars for managed-node1 28285 1727204265.24289: Calling groups_inventory to load vars for managed-node1 28285 1727204265.24291: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204265.24303: Calling all_plugins_play to load vars for managed-node1 28285 1727204265.24305: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204265.24308: Calling groups_plugins_play to load vars for managed-node1 28285 1727204265.24502: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204265.24715: done with get_vars() 28285 1727204265.24726: done getting variables 28285 1727204265.24758: done sending task result for task 0affcd87-79f5-57a1-d976-000000000021 28285 1727204265.24761: WORKER PROCESS EXITING 28285 1727204265.24796: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:57:45 -0400 (0:00:00.064) 0:00:06.058 ***** 28285 1727204265.24834: entering _queue_task() for managed-node1/fail 28285 1727204265.25098: worker is 1 (out of 1 available) 28285 1727204265.25145: exiting _queue_task() for managed-node1/fail 28285 1727204265.25160: done queuing things up, now waiting for results queue to drain 28285 1727204265.25161: waiting for pending results... 
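This is another fail action (roles/network/tasks/main.yml:60); a hedged sketch, with the message text invented purely for illustration:

- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: Restarting NetworkManager is required but was not confirmed  # assumed wording
  when: (ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)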
28285 1727204265.25389: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28285 1727204265.25534: in run() - task 0affcd87-79f5-57a1-d976-000000000022 28285 1727204265.25554: variable 'ansible_search_path' from source: unknown 28285 1727204265.25561: variable 'ansible_search_path' from source: unknown 28285 1727204265.25602: calling self._execute() 28285 1727204265.25704: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204265.25718: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204265.25737: variable 'omit' from source: magic vars 28285 1727204265.26557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204265.28819: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204265.28869: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204265.28899: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204265.28925: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204265.28952: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204265.29017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204265.29072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204265.29126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204265.29174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204265.29197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204265.29358: variable 'ansible_distribution' from source: facts 28285 1727204265.29378: variable 'ansible_distribution_major_version' from source: facts 28285 1727204265.29400: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204265.29407: when evaluation is False, skipping this task 28285 1727204265.29413: _execute() done 28285 1727204265.29418: dumping result to json 28285 1727204265.29424: done dumping result, returning 28285 1727204265.29434: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-57a1-d976-000000000022] 28285 1727204265.29443: sending task result for task 0affcd87-79f5-57a1-d976-000000000022 28285 1727204265.29557: done sending task result for task 
0affcd87-79f5-57a1-d976-000000000022 28285 1727204265.29566: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204265.29883: no more pending results, returning what we have 28285 1727204265.29886: results queue empty 28285 1727204265.29887: checking for any_errors_fatal 28285 1727204265.29894: done checking for any_errors_fatal 28285 1727204265.29895: checking for max_fail_percentage 28285 1727204265.29896: done checking for max_fail_percentage 28285 1727204265.29897: checking to see if all hosts have failed and the running result is not ok 28285 1727204265.29898: done checking to see if all hosts have failed 28285 1727204265.29899: getting the remaining hosts for this loop 28285 1727204265.29900: done getting the remaining hosts for this loop 28285 1727204265.29904: getting the next task for host managed-node1 28285 1727204265.29912: done getting next task for host managed-node1 28285 1727204265.29916: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 28285 1727204265.29919: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204265.29934: getting variables 28285 1727204265.29935: in VariableManager get_vars() 28285 1727204265.29984: Calling all_inventory to load vars for managed-node1 28285 1727204265.29986: Calling groups_inventory to load vars for managed-node1 28285 1727204265.29989: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204265.29997: Calling all_plugins_play to load vars for managed-node1 28285 1727204265.30000: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204265.30006: Calling groups_plugins_play to load vars for managed-node1 28285 1727204265.30229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204265.30462: done with get_vars() 28285 1727204265.30478: done getting variables 28285 1727204265.30537: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:57:45 -0400 (0:00:00.057) 0:00:06.115 ***** 28285 1727204265.30569: entering _queue_task() for managed-node1/package 28285 1727204265.30840: worker is 1 (out of 1 available) 28285 1727204265.30852: exiting _queue_task() for managed-node1/package 28285 1727204265.30865: done queuing things up, now waiting for results queue to drain 28285 1727204265.30867: waiting for pending results... 
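The package action queued here (roles/network/tasks/main.yml:73) would, as a rough sketch, install a role-defined package list; the variable name below is hypothetical:

- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"   # hypothetical variable name
    state: present
  when: (ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)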
28285 1727204265.31124: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages 28285 1727204265.31260: in run() - task 0affcd87-79f5-57a1-d976-000000000023 28285 1727204265.31282: variable 'ansible_search_path' from source: unknown 28285 1727204265.31287: variable 'ansible_search_path' from source: unknown 28285 1727204265.31328: calling self._execute() 28285 1727204265.31411: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204265.31426: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204265.31461: variable 'omit' from source: magic vars 28285 1727204265.31909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204265.33881: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204265.33939: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204265.33978: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204265.34009: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204265.34061: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204265.34138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204265.34174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204265.34203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204265.34248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204265.34272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204265.34404: variable 'ansible_distribution' from source: facts 28285 1727204265.34414: variable 'ansible_distribution_major_version' from source: facts 28285 1727204265.34435: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204265.34441: when evaluation is False, skipping this task 28285 1727204265.34446: _execute() done 28285 1727204265.34451: dumping result to json 28285 1727204265.34461: done dumping result, returning 28285 1727204265.34483: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-57a1-d976-000000000023] 28285 1727204265.34501: sending task result for task 0affcd87-79f5-57a1-d976-000000000023 28285 1727204265.34795: done sending task result for task 0affcd87-79f5-57a1-d976-000000000023 skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in 
['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204265.34844: no more pending results, returning what we have 28285 1727204265.34847: results queue empty 28285 1727204265.34848: checking for any_errors_fatal 28285 1727204265.34853: done checking for any_errors_fatal 28285 1727204265.34854: checking for max_fail_percentage 28285 1727204265.34856: done checking for max_fail_percentage 28285 1727204265.34856: checking to see if all hosts have failed and the running result is not ok 28285 1727204265.34857: done checking to see if all hosts have failed 28285 1727204265.34858: getting the remaining hosts for this loop 28285 1727204265.34859: done getting the remaining hosts for this loop 28285 1727204265.34863: getting the next task for host managed-node1 28285 1727204265.34871: done getting next task for host managed-node1 28285 1727204265.34875: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28285 1727204265.34878: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204265.34890: getting variables 28285 1727204265.34892: in VariableManager get_vars() 28285 1727204265.34943: Calling all_inventory to load vars for managed-node1 28285 1727204265.34945: Calling groups_inventory to load vars for managed-node1 28285 1727204265.34948: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204265.34957: Calling all_plugins_play to load vars for managed-node1 28285 1727204265.34960: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204265.34962: Calling groups_plugins_play to load vars for managed-node1 28285 1727204265.35140: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204265.35353: done with get_vars() 28285 1727204265.35366: done getting variables 28285 1727204265.35429: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:57:45 -0400 (0:00:00.048) 0:00:06.164 ***** 28285 1727204265.35622: entering _queue_task() for managed-node1/package 28285 1727204265.35638: WORKER PROCESS EXITING 28285 1727204265.36026: worker is 1 (out of 1 available) 28285 1727204265.36039: exiting _queue_task() for managed-node1/package 28285 1727204265.36054: done queuing things up, now waiting for results queue to drain 28285 1727204265.36056: waiting for pending results... 
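For the network_state-driven install at roles/network/tasks/main.yml:85, a plausible shape is shown below; the extra network_state guard is inferred from the task name, not from the trace:

- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when:
    - (ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)
    - network_state != {}            # assumed guard implied by the task name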
28285 1727204265.36305: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28285 1727204265.36441: in run() - task 0affcd87-79f5-57a1-d976-000000000024 28285 1727204265.36461: variable 'ansible_search_path' from source: unknown 28285 1727204265.36473: variable 'ansible_search_path' from source: unknown 28285 1727204265.36519: calling self._execute() 28285 1727204265.36605: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204265.36616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204265.36627: variable 'omit' from source: magic vars 28285 1727204265.37062: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204265.38752: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204265.38800: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204265.38829: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204265.38856: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204265.38879: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204265.38937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204265.38958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204265.38980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204265.39012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204265.39022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204265.39127: variable 'ansible_distribution' from source: facts 28285 1727204265.39133: variable 'ansible_distribution_major_version' from source: facts 28285 1727204265.39147: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204265.39150: when evaluation is False, skipping this task 28285 1727204265.39155: _execute() done 28285 1727204265.39158: dumping result to json 28285 1727204265.39160: done dumping result, returning 28285 1727204265.39169: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-57a1-d976-000000000024] 28285 1727204265.39174: sending task result for task 0affcd87-79f5-57a1-d976-000000000024 28285 1727204265.39263: done sending task result for task 
0affcd87-79f5-57a1-d976-000000000024 28285 1727204265.39268: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204265.39310: no more pending results, returning what we have 28285 1727204265.39314: results queue empty 28285 1727204265.39314: checking for any_errors_fatal 28285 1727204265.39321: done checking for any_errors_fatal 28285 1727204265.39321: checking for max_fail_percentage 28285 1727204265.39323: done checking for max_fail_percentage 28285 1727204265.39324: checking to see if all hosts have failed and the running result is not ok 28285 1727204265.39325: done checking to see if all hosts have failed 28285 1727204265.39325: getting the remaining hosts for this loop 28285 1727204265.39327: done getting the remaining hosts for this loop 28285 1727204265.39330: getting the next task for host managed-node1 28285 1727204265.39336: done getting next task for host managed-node1 28285 1727204265.39339: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28285 1727204265.39343: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28285 1727204265.39356: getting variables 28285 1727204265.39358: in VariableManager get_vars() 28285 1727204265.39407: Calling all_inventory to load vars for managed-node1 28285 1727204265.39410: Calling groups_inventory to load vars for managed-node1 28285 1727204265.39412: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204265.39421: Calling all_plugins_play to load vars for managed-node1 28285 1727204265.39423: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204265.39425: Calling groups_plugins_play to load vars for managed-node1 28285 1727204265.39579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204265.39701: done with get_vars() 28285 1727204265.39711: done getting variables 28285 1727204265.39752: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:57:45 -0400 (0:00:00.043) 0:00:06.207 ***** 28285 1727204265.39776: entering _queue_task() for managed-node1/package 28285 1727204265.39961: worker is 1 (out of 1 available) 28285 1727204265.39976: exiting _queue_task() for managed-node1/package 28285 1727204265.39988: done queuing things up, now waiting for results queue to drain 28285 1727204265.39990: waiting for pending results... 
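The companion task at roles/network/tasks/main.yml:96 presumably differs only in the package it installs; the same caveats apply:

- name: Install python3-libnmstate when using network_state variable
  ansible.builtin.package:
    name: python3-libnmstate
    state: present
  when:
    - (ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)
    - network_state != {}            # assumed guard implied by the task name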
28285 1727204265.40148: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28285 1727204265.40231: in run() - task 0affcd87-79f5-57a1-d976-000000000025 28285 1727204265.40240: variable 'ansible_search_path' from source: unknown 28285 1727204265.40243: variable 'ansible_search_path' from source: unknown 28285 1727204265.40275: calling self._execute() 28285 1727204265.40335: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204265.40345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204265.40351: variable 'omit' from source: magic vars 28285 1727204265.40663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204265.42287: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204265.42338: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204265.42369: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204265.42396: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204265.42419: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204265.42479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204265.42500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204265.42521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204265.42548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204265.42561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204265.42660: variable 'ansible_distribution' from source: facts 28285 1727204265.42666: variable 'ansible_distribution_major_version' from source: facts 28285 1727204265.42682: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204265.42685: when evaluation is False, skipping this task 28285 1727204265.42687: _execute() done 28285 1727204265.42690: dumping result to json 28285 1727204265.42692: done dumping result, returning 28285 1727204265.42699: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-57a1-d976-000000000025] 28285 1727204265.42704: sending task result for task 0affcd87-79f5-57a1-d976-000000000025 28285 1727204265.42794: done sending task result for task 0affcd87-79f5-57a1-d976-000000000025 28285 
1727204265.42796: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204265.42867: no more pending results, returning what we have 28285 1727204265.42870: results queue empty 28285 1727204265.42871: checking for any_errors_fatal 28285 1727204265.42878: done checking for any_errors_fatal 28285 1727204265.42879: checking for max_fail_percentage 28285 1727204265.42880: done checking for max_fail_percentage 28285 1727204265.42881: checking to see if all hosts have failed and the running result is not ok 28285 1727204265.42882: done checking to see if all hosts have failed 28285 1727204265.42882: getting the remaining hosts for this loop 28285 1727204265.42884: done getting the remaining hosts for this loop 28285 1727204265.42888: getting the next task for host managed-node1 28285 1727204265.42893: done getting next task for host managed-node1 28285 1727204265.42897: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28285 1727204265.42899: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28285 1727204265.42912: getting variables 28285 1727204265.42913: in VariableManager get_vars() 28285 1727204265.42967: Calling all_inventory to load vars for managed-node1 28285 1727204265.42970: Calling groups_inventory to load vars for managed-node1 28285 1727204265.42972: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204265.42980: Calling all_plugins_play to load vars for managed-node1 28285 1727204265.42982: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204265.42985: Calling groups_plugins_play to load vars for managed-node1 28285 1727204265.43096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204265.43222: done with get_vars() 28285 1727204265.43230: done getting variables 28285 1727204265.43300: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:57:45 -0400 (0:00:00.035) 0:00:06.243 ***** 28285 1727204265.43323: entering _queue_task() for managed-node1/service 28285 1727204265.43324: Creating lock for service 28285 1727204265.43523: worker is 1 (out of 1 available) 28285 1727204265.43537: exiting _queue_task() for managed-node1/service 28285 1727204265.43548: done queuing things up, now waiting for results queue to drain 28285 1727204265.43549: waiting for pending results... 
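The first service action (roles/network/tasks/main.yml:109) restarts NetworkManager. A minimal sketch, with the service name inferred from the task title:

- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  when: (ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)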
28285 1727204265.43712: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28285 1727204265.43790: in run() - task 0affcd87-79f5-57a1-d976-000000000026 28285 1727204265.43800: variable 'ansible_search_path' from source: unknown 28285 1727204265.43803: variable 'ansible_search_path' from source: unknown 28285 1727204265.43833: calling self._execute() 28285 1727204265.43896: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204265.43900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204265.43908: variable 'omit' from source: magic vars 28285 1727204265.44215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204265.45826: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204265.45874: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204265.45906: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204265.45932: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204265.45956: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204265.46014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204265.46034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204265.46055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204265.46084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204265.46097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204265.46197: variable 'ansible_distribution' from source: facts 28285 1727204265.46204: variable 'ansible_distribution_major_version' from source: facts 28285 1727204265.46223: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204265.46226: when evaluation is False, skipping this task 28285 1727204265.46229: _execute() done 28285 1727204265.46231: dumping result to json 28285 1727204265.46233: done dumping result, returning 28285 1727204265.46239: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-57a1-d976-000000000026] 28285 1727204265.46245: sending task result for task 0affcd87-79f5-57a1-d976-000000000026 28285 1727204265.46331: done sending task result for task 0affcd87-79f5-57a1-d976-000000000026 28285 
1727204265.46334: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204265.46408: no more pending results, returning what we have 28285 1727204265.46411: results queue empty 28285 1727204265.46412: checking for any_errors_fatal 28285 1727204265.46416: done checking for any_errors_fatal 28285 1727204265.46417: checking for max_fail_percentage 28285 1727204265.46418: done checking for max_fail_percentage 28285 1727204265.46419: checking to see if all hosts have failed and the running result is not ok 28285 1727204265.46420: done checking to see if all hosts have failed 28285 1727204265.46421: getting the remaining hosts for this loop 28285 1727204265.46422: done getting the remaining hosts for this loop 28285 1727204265.46426: getting the next task for host managed-node1 28285 1727204265.46432: done getting next task for host managed-node1 28285 1727204265.46436: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28285 1727204265.46439: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204265.46457: getting variables 28285 1727204265.46459: in VariableManager get_vars() 28285 1727204265.46505: Calling all_inventory to load vars for managed-node1 28285 1727204265.46507: Calling groups_inventory to load vars for managed-node1 28285 1727204265.46509: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204265.46515: Calling all_plugins_play to load vars for managed-node1 28285 1727204265.46517: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204265.46518: Calling groups_plugins_play to load vars for managed-node1 28285 1727204265.46668: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204265.46790: done with get_vars() 28285 1727204265.46797: done getting variables 28285 1727204265.46836: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:57:45 -0400 (0:00:00.035) 0:00:06.278 ***** 28285 1727204265.46859: entering _queue_task() for managed-node1/service 28285 1727204265.47051: worker is 1 (out of 1 available) 28285 1727204265.47063: exiting _queue_task() for managed-node1/service 28285 1727204265.47076: done queuing things up, now waiting for results queue to drain 28285 1727204265.47078: waiting for pending results... 
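The next service task (roles/network/tasks/main.yml:122) enables and starts NetworkManager; the censored skip result further down confirms it sets no_log: true. Apart from that, the arguments below are assumptions:

- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: NetworkManager
    state: started
    enabled: true
  no_log: true   # confirmed by the censored result reported for this task
  when: (ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)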
28285 1727204265.47231: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28285 1727204265.47311: in run() - task 0affcd87-79f5-57a1-d976-000000000027 28285 1727204265.47322: variable 'ansible_search_path' from source: unknown 28285 1727204265.47325: variable 'ansible_search_path' from source: unknown 28285 1727204265.47355: calling self._execute() 28285 1727204265.47411: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204265.47421: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204265.47432: variable 'omit' from source: magic vars 28285 1727204265.47725: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204265.49299: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204265.49351: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204265.49380: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204265.49406: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204265.49425: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204265.49484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204265.49507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204265.49525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204265.49553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204265.49566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204265.49675: variable 'ansible_distribution' from source: facts 28285 1727204265.49679: variable 'ansible_distribution_major_version' from source: facts 28285 1727204265.49693: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204265.49698: when evaluation is False, skipping this task 28285 1727204265.49700: _execute() done 28285 1727204265.49706: dumping result to json 28285 1727204265.49709: done dumping result, returning 28285 1727204265.49712: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-57a1-d976-000000000027] 28285 1727204265.49718: sending task result for task 0affcd87-79f5-57a1-d976-000000000027 28285 1727204265.49804: done sending task result for task 0affcd87-79f5-57a1-d976-000000000027 28285 1727204265.49807: WORKER PROCESS EXITING skipping: 
[managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28285 1727204265.49870: no more pending results, returning what we have 28285 1727204265.49873: results queue empty 28285 1727204265.49874: checking for any_errors_fatal 28285 1727204265.49882: done checking for any_errors_fatal 28285 1727204265.49882: checking for max_fail_percentage 28285 1727204265.49884: done checking for max_fail_percentage 28285 1727204265.49885: checking to see if all hosts have failed and the running result is not ok 28285 1727204265.49886: done checking to see if all hosts have failed 28285 1727204265.49886: getting the remaining hosts for this loop 28285 1727204265.49888: done getting the remaining hosts for this loop 28285 1727204265.49891: getting the next task for host managed-node1 28285 1727204265.49897: done getting next task for host managed-node1 28285 1727204265.49901: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28285 1727204265.49904: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204265.49917: getting variables 28285 1727204265.49918: in VariableManager get_vars() 28285 1727204265.49972: Calling all_inventory to load vars for managed-node1 28285 1727204265.49975: Calling groups_inventory to load vars for managed-node1 28285 1727204265.49977: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204265.49986: Calling all_plugins_play to load vars for managed-node1 28285 1727204265.49988: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204265.49990: Calling groups_plugins_play to load vars for managed-node1 28285 1727204265.50101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204265.50224: done with get_vars() 28285 1727204265.50232: done getting variables 28285 1727204265.50281: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:57:45 -0400 (0:00:00.034) 0:00:06.313 ***** 28285 1727204265.50305: entering _queue_task() for managed-node1/service 28285 1727204265.50498: worker is 1 (out of 1 available) 28285 1727204265.50510: exiting _queue_task() for managed-node1/service 28285 1727204265.50524: done queuing things up, now waiting for results queue to drain 28285 1727204265.50525: waiting for pending results... 
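The result just above is censored because the task sets no_log: true, so even a skipped result's details are hidden. Going only by what the log exposes (the service action plugin, the task name at main.yml:122, and the shared guard), a task of that shape would look roughly like the sketch below; the service name and arguments are assumptions, not the role's actual source:

- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: NetworkManager        # assumed unit name
    state: started
    enabled: true
  no_log: true                  # this is why the result above is censored
  when: ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9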
28285 1727204265.50680: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28285 1727204265.50762: in run() - task 0affcd87-79f5-57a1-d976-000000000028 28285 1727204265.50774: variable 'ansible_search_path' from source: unknown 28285 1727204265.50777: variable 'ansible_search_path' from source: unknown 28285 1727204265.50806: calling self._execute() 28285 1727204265.50869: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204265.50875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204265.50884: variable 'omit' from source: magic vars 28285 1727204265.51191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204265.52828: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204265.52877: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204265.52904: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204265.52930: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204265.52949: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204265.53010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204265.53028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204265.53046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204265.53083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204265.53090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204265.53190: variable 'ansible_distribution' from source: facts 28285 1727204265.53199: variable 'ansible_distribution_major_version' from source: facts 28285 1727204265.53214: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204265.53216: when evaluation is False, skipping this task 28285 1727204265.53219: _execute() done 28285 1727204265.53222: dumping result to json 28285 1727204265.53224: done dumping result, returning 28285 1727204265.53232: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-57a1-d976-000000000028] 28285 1727204265.53238: sending task result for task 0affcd87-79f5-57a1-d976-000000000028 28285 1727204265.53322: done sending task result for task 0affcd87-79f5-57a1-d976-000000000028 28285 1727204265.53325: WORKER PROCESS EXITING skipping: 
[managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204265.53398: no more pending results, returning what we have 28285 1727204265.53401: results queue empty 28285 1727204265.53402: checking for any_errors_fatal 28285 1727204265.53409: done checking for any_errors_fatal 28285 1727204265.53409: checking for max_fail_percentage 28285 1727204265.53411: done checking for max_fail_percentage 28285 1727204265.53412: checking to see if all hosts have failed and the running result is not ok 28285 1727204265.53413: done checking to see if all hosts have failed 28285 1727204265.53413: getting the remaining hosts for this loop 28285 1727204265.53415: done getting the remaining hosts for this loop 28285 1727204265.53419: getting the next task for host managed-node1 28285 1727204265.53424: done getting next task for host managed-node1 28285 1727204265.53428: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 28285 1727204265.53431: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204265.53443: getting variables 28285 1727204265.53444: in VariableManager get_vars() 28285 1727204265.53493: Calling all_inventory to load vars for managed-node1 28285 1727204265.53496: Calling groups_inventory to load vars for managed-node1 28285 1727204265.53498: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204265.53505: Calling all_plugins_play to load vars for managed-node1 28285 1727204265.53507: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204265.53509: Calling groups_plugins_play to load vars for managed-node1 28285 1727204265.53662: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204265.53784: done with get_vars() 28285 1727204265.53791: done getting variables 28285 1727204265.53830: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:57:45 -0400 (0:00:00.035) 0:00:06.348 ***** 28285 1727204265.53855: entering _queue_task() for managed-node1/service 28285 1727204265.54043: worker is 1 (out of 1 available) 28285 1727204265.54057: exiting _queue_task() for managed-node1/service 28285 1727204265.54069: done queuing things up, now waiting for results queue to drain 28285 1727204265.54071: waiting for pending results... 
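Every skip in this stretch reports the identical false_condition string, so the same distribution check is being re-evaluated task by task (the role's real when lists may contain more conditions than the one shown). When writing a fresh tasks file with a shared guard like this, the check is often hoisted onto a block so it is stated once; a generic sketch, not how main.yml is organized:

- name: Legacy CentOS/RHEL service handling   # hypothetical grouping
  when:
    - ansible_distribution in ['CentOS','RedHat']
    - ansible_distribution_major_version | int < 9
  block:
    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: NetworkManager
        state: started
        enabled: true
    - name: Enable and start wpa_supplicant
      ansible.builtin.service:
        name: wpa_supplicant
        state: started
        enabled: true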
28285 1727204265.54225: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service 28285 1727204265.54305: in run() - task 0affcd87-79f5-57a1-d976-000000000029 28285 1727204265.54315: variable 'ansible_search_path' from source: unknown 28285 1727204265.54318: variable 'ansible_search_path' from source: unknown 28285 1727204265.54345: calling self._execute() 28285 1727204265.54406: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204265.54411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204265.54419: variable 'omit' from source: magic vars 28285 1727204265.54714: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204265.56816: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204265.56915: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204265.56970: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204265.57029: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204265.57072: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204265.57165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204265.57223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204265.57258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204265.57318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204265.57339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204265.57529: variable 'ansible_distribution' from source: facts 28285 1727204265.57541: variable 'ansible_distribution_major_version' from source: facts 28285 1727204265.57567: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204265.57575: when evaluation is False, skipping this task 28285 1727204265.57582: _execute() done 28285 1727204265.57594: dumping result to json 28285 1727204265.57608: done dumping result, returning 28285 1727204265.57620: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-57a1-d976-000000000029] 28285 1727204265.57633: sending task result for task 0affcd87-79f5-57a1-d976-000000000029 skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28285 
1727204265.57822: no more pending results, returning what we have 28285 1727204265.57825: results queue empty 28285 1727204265.57826: checking for any_errors_fatal 28285 1727204265.57835: done checking for any_errors_fatal 28285 1727204265.57836: checking for max_fail_percentage 28285 1727204265.57837: done checking for max_fail_percentage 28285 1727204265.57838: checking to see if all hosts have failed and the running result is not ok 28285 1727204265.57839: done checking to see if all hosts have failed 28285 1727204265.57840: getting the remaining hosts for this loop 28285 1727204265.57842: done getting the remaining hosts for this loop 28285 1727204265.57846: getting the next task for host managed-node1 28285 1727204265.57855: done getting next task for host managed-node1 28285 1727204265.57859: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28285 1727204265.57862: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204265.57879: getting variables 28285 1727204265.57881: in VariableManager get_vars() 28285 1727204265.57943: Calling all_inventory to load vars for managed-node1 28285 1727204265.57947: Calling groups_inventory to load vars for managed-node1 28285 1727204265.57950: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204265.57966: Calling all_plugins_play to load vars for managed-node1 28285 1727204265.57969: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204265.57972: Calling groups_plugins_play to load vars for managed-node1 28285 1727204265.58174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204265.58422: done with get_vars() 28285 1727204265.58436: done getting variables 28285 1727204265.58621: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 28285 1727204265.58660: done sending task result for task 0affcd87-79f5-57a1-d976-000000000029 28285 1727204265.58663: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:57:45 -0400 (0:00:00.048) 0:00:06.396 ***** 28285 1727204265.58680: entering _queue_task() for managed-node1/copy 28285 1727204265.59026: worker is 1 (out of 1 available) 28285 1727204265.59037: exiting _queue_task() for managed-node1/copy 28285 1727204265.59048: done queuing things up, now waiting for results queue to drain 28285 1727204265.59050: waiting for pending results... 
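The task queued at the end of this chunk, Ensure initscripts network file dependency is present (main.yml:150), resolves to the copy action plugin. Based only on the task name and the action shown in the log, a task of that shape might look like the following; the destination path, content, force, and mode are assumptions for illustration:

- name: Ensure initscripts network file dependency is present
  ansible.builtin.copy:
    dest: /etc/sysconfig/network    # assumed initscripts network file location
    content: ""
    force: false                    # only create the file if it is missing (assumption)
    mode: "0644"
  when: ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9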
28285 1727204265.59231: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28285 1727204265.59310: in run() - task 0affcd87-79f5-57a1-d976-00000000002a 28285 1727204265.59321: variable 'ansible_search_path' from source: unknown 28285 1727204265.59324: variable 'ansible_search_path' from source: unknown 28285 1727204265.59359: calling self._execute() 28285 1727204265.59431: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204265.59435: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204265.59443: variable 'omit' from source: magic vars 28285 1727204265.59770: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204265.62186: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204265.62257: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204265.62303: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204265.62345: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204265.62379: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204265.62468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204265.62504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204265.62542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204265.62594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204265.62615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204265.62781: variable 'ansible_distribution' from source: facts 28285 1727204265.62796: variable 'ansible_distribution_major_version' from source: facts 28285 1727204265.62820: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204265.62832: when evaluation is False, skipping this task 28285 1727204265.62838: _execute() done 28285 1727204265.62845: dumping result to json 28285 1727204265.62857: done dumping result, returning 28285 1727204265.62872: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-57a1-d976-00000000002a] 28285 1727204265.62882: sending task result for task 0affcd87-79f5-57a1-d976-00000000002a skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204265.63035: no more pending results, returning what we have 28285 1727204265.63039: results queue empty 28285 1727204265.63040: checking for any_errors_fatal 28285 1727204265.63046: done checking for any_errors_fatal 28285 1727204265.63047: checking for max_fail_percentage 28285 1727204265.63049: done checking for max_fail_percentage 28285 1727204265.63050: checking to see if all hosts have failed and the running result is not ok 28285 1727204265.63050: done checking to see if all hosts have failed 28285 1727204265.63051: getting the remaining hosts for this loop 28285 1727204265.63053: done getting the remaining hosts for this loop 28285 1727204265.63057: getting the next task for host managed-node1 28285 1727204265.63066: done getting next task for host managed-node1 28285 1727204265.63071: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28285 1727204265.63075: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204265.63090: getting variables 28285 1727204265.63092: in VariableManager get_vars() 28285 1727204265.63151: Calling all_inventory to load vars for managed-node1 28285 1727204265.63154: Calling groups_inventory to load vars for managed-node1 28285 1727204265.63157: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204265.63169: Calling all_plugins_play to load vars for managed-node1 28285 1727204265.63172: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204265.63175: Calling groups_plugins_play to load vars for managed-node1 28285 1727204265.63409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204265.63617: done with get_vars() 28285 1727204265.63628: done getting variables 28285 1727204265.63950: done sending task result for task 0affcd87-79f5-57a1-d976-00000000002a 28285 1727204265.63953: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:57:45 -0400 (0:00:00.053) 0:00:06.450 ***** 28285 1727204265.64011: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 28285 1727204265.64013: Creating lock for fedora.linux_system_roles.network_connections 28285 1727204265.64571: worker is 1 (out of 1 available) 28285 1727204265.64581: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 28285 1727204265.64594: done queuing things up, now waiting for results queue to drain 28285 1727204265.64596: waiting for pending results... 
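The Configure networking connection profiles step (main.yml:159) is the point where the role hands whatever the play supplied in network_connections to its network_connections module (note the lock being created for it above). From the consumer side, a minimal invocation looks like this; the interface name and settings are placeholders, not values taken from this run:

- name: Apply a connection profile via the network role
  hosts: managed-node1
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: eth0            # placeholder profile/interface name
            type: ethernet
            state: up
            ip:
              dhcp4: true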
28285 1727204265.64956: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28285 1727204265.65241: in run() - task 0affcd87-79f5-57a1-d976-00000000002b 28285 1727204265.65328: variable 'ansible_search_path' from source: unknown 28285 1727204265.65337: variable 'ansible_search_path' from source: unknown 28285 1727204265.65382: calling self._execute() 28285 1727204265.65461: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204265.65478: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204265.65493: variable 'omit' from source: magic vars 28285 1727204265.66039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204265.71262: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204265.71460: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204265.71548: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204265.71686: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204265.71732: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204265.71814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204265.71851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204265.71886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204265.71940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204265.71961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204265.72109: variable 'ansible_distribution' from source: facts 28285 1727204265.72121: variable 'ansible_distribution_major_version' from source: facts 28285 1727204265.72144: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204265.72155: when evaluation is False, skipping this task 28285 1727204265.72161: _execute() done 28285 1727204265.72170: dumping result to json 28285 1727204265.72177: done dumping result, returning 28285 1727204265.72187: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-57a1-d976-00000000002b] 28285 1727204265.72197: sending task result for task 0affcd87-79f5-57a1-d976-00000000002b skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204265.72347: no more pending results, returning what we have 28285 1727204265.72350: results queue empty 28285 1727204265.72351: checking for any_errors_fatal 28285 1727204265.72358: done checking for any_errors_fatal 28285 1727204265.72359: checking for max_fail_percentage 28285 1727204265.72361: done checking for max_fail_percentage 28285 1727204265.72361: checking to see if all hosts have failed and the running result is not ok 28285 1727204265.72362: done checking to see if all hosts have failed 28285 1727204265.72363: getting the remaining hosts for this loop 28285 1727204265.72367: done getting the remaining hosts for this loop 28285 1727204265.72371: getting the next task for host managed-node1 28285 1727204265.72377: done getting next task for host managed-node1 28285 1727204265.72381: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 28285 1727204265.72384: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204265.72398: getting variables 28285 1727204265.72400: in VariableManager get_vars() 28285 1727204265.72462: Calling all_inventory to load vars for managed-node1 28285 1727204265.72468: Calling groups_inventory to load vars for managed-node1 28285 1727204265.72471: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204265.72482: Calling all_plugins_play to load vars for managed-node1 28285 1727204265.72484: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204265.72487: Calling groups_plugins_play to load vars for managed-node1 28285 1727204265.72655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204265.72851: done with get_vars() 28285 1727204265.72865: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:57:45 -0400 (0:00:00.089) 0:00:06.539 ***** 28285 1727204265.72958: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_state 28285 1727204265.72960: Creating lock for fedora.linux_system_roles.network_state 28285 1727204265.73287: done sending task result for task 0affcd87-79f5-57a1-d976-00000000002b 28285 1727204265.73290: WORKER PROCESS EXITING 28285 1727204265.73610: worker is 1 (out of 1 available) 28285 1727204265.73623: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_state 28285 1727204265.73636: done queuing things up, now waiting for results queue to drain 28285 1727204265.73638: waiting for pending results... 
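The Configure networking state step (main.yml:171) is the counterpart driven by the role's network_state variable, applied through its network_state module (lock created above). A hedged sketch of feeding it nmstate-style desired state; the interface and settings are placeholders, and the accepted schema is defined by the role and nmstate, not here:

- name: Apply desired networking state via the network role
  hosts: managed-node1
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_state:
          interfaces:
            - name: eth0          # placeholder interface
              type: ethernet
              state: up
              ipv4:
                enabled: true
                dhcp: true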
28285 1727204265.74179: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state 28285 1727204265.74553: in run() - task 0affcd87-79f5-57a1-d976-00000000002c 28285 1727204265.74962: variable 'ansible_search_path' from source: unknown 28285 1727204265.74975: variable 'ansible_search_path' from source: unknown 28285 1727204265.75015: calling self._execute() 28285 1727204265.75097: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204265.75108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204265.75120: variable 'omit' from source: magic vars 28285 1727204265.76015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204265.80327: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204265.81154: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204265.81201: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204265.81243: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204265.81279: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204265.82051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204265.82088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204265.82118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204265.82171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204265.82192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204265.82339: variable 'ansible_distribution' from source: facts 28285 1727204265.82354: variable 'ansible_distribution_major_version' from source: facts 28285 1727204265.82381: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204265.82388: when evaluation is False, skipping this task 28285 1727204265.82394: _execute() done 28285 1727204265.82400: dumping result to json 28285 1727204265.82407: done dumping result, returning 28285 1727204265.82418: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-57a1-d976-00000000002c] 28285 1727204265.82431: sending task result for task 0affcd87-79f5-57a1-d976-00000000002c skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", 
"skip_reason": "Conditional result was False" } 28285 1727204265.82587: no more pending results, returning what we have 28285 1727204265.82591: results queue empty 28285 1727204265.82592: checking for any_errors_fatal 28285 1727204265.82599: done checking for any_errors_fatal 28285 1727204265.82600: checking for max_fail_percentage 28285 1727204265.82602: done checking for max_fail_percentage 28285 1727204265.82603: checking to see if all hosts have failed and the running result is not ok 28285 1727204265.82604: done checking to see if all hosts have failed 28285 1727204265.82605: getting the remaining hosts for this loop 28285 1727204265.82607: done getting the remaining hosts for this loop 28285 1727204265.82611: getting the next task for host managed-node1 28285 1727204265.82618: done getting next task for host managed-node1 28285 1727204265.82622: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28285 1727204265.82626: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204265.82640: getting variables 28285 1727204265.82642: in VariableManager get_vars() 28285 1727204265.82702: Calling all_inventory to load vars for managed-node1 28285 1727204265.82705: Calling groups_inventory to load vars for managed-node1 28285 1727204265.82708: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204265.82719: Calling all_plugins_play to load vars for managed-node1 28285 1727204265.82721: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204265.82724: Calling groups_plugins_play to load vars for managed-node1 28285 1727204265.82956: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204265.83440: done with get_vars() 28285 1727204265.83454: done getting variables 28285 1727204265.83486: done sending task result for task 0affcd87-79f5-57a1-d976-00000000002c 28285 1727204265.83489: WORKER PROCESS EXITING 28285 1727204265.83524: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:57:45 -0400 (0:00:00.106) 0:00:06.645 ***** 28285 1727204265.83561: entering _queue_task() for managed-node1/debug 28285 1727204265.83797: worker is 1 (out of 1 available) 28285 1727204265.83808: exiting _queue_task() for managed-node1/debug 28285 1727204265.83820: done queuing things up, now waiting for results queue to drain 28285 1727204265.83821: waiting for pending results... 
28285 1727204265.84620: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28285 1727204265.84788: in run() - task 0affcd87-79f5-57a1-d976-00000000002d 28285 1727204265.84955: variable 'ansible_search_path' from source: unknown 28285 1727204265.84963: variable 'ansible_search_path' from source: unknown 28285 1727204265.85007: calling self._execute() 28285 1727204265.85096: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204265.85108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204265.85122: variable 'omit' from source: magic vars 28285 1727204265.85812: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204265.91269: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204265.92139: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204265.92184: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204265.92907: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204265.92940: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204265.93023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204265.93057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204265.93090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204265.93133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204265.93153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204265.93295: variable 'ansible_distribution' from source: facts 28285 1727204265.93307: variable 'ansible_distribution_major_version' from source: facts 28285 1727204265.93330: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204265.93338: when evaluation is False, skipping this task 28285 1727204265.93344: _execute() done 28285 1727204265.93351: dumping result to json 28285 1727204265.93358: done dumping result, returning 28285 1727204265.93374: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-57a1-d976-00000000002d] 28285 1727204265.93386: sending task result for task 0affcd87-79f5-57a1-d976-00000000002d skipping: [managed-node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)" } 28285 1727204265.93529: no more pending results, returning what we have 28285 1727204265.93533: results queue empty 28285 1727204265.93534: checking for any_errors_fatal 28285 1727204265.93543: done checking for any_errors_fatal 28285 1727204265.93544: checking for max_fail_percentage 28285 1727204265.93546: done checking for max_fail_percentage 28285 1727204265.93547: checking to see if all hosts have failed and the running result is not ok 28285 1727204265.93547: done checking to see if all hosts have failed 28285 1727204265.93548: getting the remaining hosts for this loop 28285 1727204265.93550: done getting the remaining hosts for this loop 28285 1727204265.93554: getting the next task for host managed-node1 28285 1727204265.93562: done getting next task for host managed-node1 28285 1727204265.93569: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28285 1727204265.93572: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204265.93586: getting variables 28285 1727204265.93588: in VariableManager get_vars() 28285 1727204265.93648: Calling all_inventory to load vars for managed-node1 28285 1727204265.93651: Calling groups_inventory to load vars for managed-node1 28285 1727204265.93654: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204265.93666: Calling all_plugins_play to load vars for managed-node1 28285 1727204265.93669: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204265.93672: Calling groups_plugins_play to load vars for managed-node1 28285 1727204265.93847: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204265.94052: done with get_vars() 28285 1727204265.94066: done getting variables 28285 1727204265.94129: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:57:45 -0400 (0:00:00.106) 0:00:06.751 ***** 28285 1727204265.94445: entering _queue_task() for managed-node1/debug 28285 1727204265.94460: done sending task result for task 0affcd87-79f5-57a1-d976-00000000002d 28285 1727204265.94463: WORKER PROCESS EXITING 28285 1727204265.94706: worker is 1 (out of 1 available) 28285 1727204265.94720: exiting _queue_task() for managed-node1/debug 28285 1727204265.94732: done queuing things up, now waiting for results queue to drain 28285 1727204265.94733: waiting for pending results... 
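The two debug tasks logged here (main.yml:177 and main.yml:181) only surface stderr and debug messages captured from the connection-profile step, and they are skipped under the same guard. Outside the role the pattern is simply a debug over a registered result; the register name below is hypothetical, not the role's internal variable:

- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_result.stderr    # __network_result is a hypothetical registered result
  when: __network_result is defined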
28285 1727204265.95642: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28285 1727204265.95950: in run() - task 0affcd87-79f5-57a1-d976-00000000002e 28285 1727204265.95972: variable 'ansible_search_path' from source: unknown 28285 1727204265.95980: variable 'ansible_search_path' from source: unknown 28285 1727204265.96019: calling self._execute() 28285 1727204265.96226: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204265.96237: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204265.96251: variable 'omit' from source: magic vars 28285 1727204265.97124: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204266.01888: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204266.02081: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204266.02157: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204266.02261: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204266.02362: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204266.02560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204266.02597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204266.02629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204266.02703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204266.02788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204266.03051: variable 'ansible_distribution' from source: facts 28285 1727204266.03207: variable 'ansible_distribution_major_version' from source: facts 28285 1727204266.03232: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204266.03240: when evaluation is False, skipping this task 28285 1727204266.03247: _execute() done 28285 1727204266.03253: dumping result to json 28285 1727204266.03261: done dumping result, returning 28285 1727204266.03277: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-57a1-d976-00000000002e] 28285 1727204266.03289: sending task result for task 0affcd87-79f5-57a1-d976-00000000002e 28285 1727204266.03399: done sending task result for task 0affcd87-79f5-57a1-d976-00000000002e 28285 1727204266.03406: WORKER 
PROCESS EXITING skipping: [managed-node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 28285 1727204266.03454: no more pending results, returning what we have 28285 1727204266.03458: results queue empty 28285 1727204266.03458: checking for any_errors_fatal 28285 1727204266.03465: done checking for any_errors_fatal 28285 1727204266.03466: checking for max_fail_percentage 28285 1727204266.03468: done checking for max_fail_percentage 28285 1727204266.03469: checking to see if all hosts have failed and the running result is not ok 28285 1727204266.03470: done checking to see if all hosts have failed 28285 1727204266.03471: getting the remaining hosts for this loop 28285 1727204266.03472: done getting the remaining hosts for this loop 28285 1727204266.03476: getting the next task for host managed-node1 28285 1727204266.03483: done getting next task for host managed-node1 28285 1727204266.03488: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28285 1727204266.03491: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204266.03505: getting variables 28285 1727204266.03507: in VariableManager get_vars() 28285 1727204266.03565: Calling all_inventory to load vars for managed-node1 28285 1727204266.03568: Calling groups_inventory to load vars for managed-node1 28285 1727204266.03571: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204266.03582: Calling all_plugins_play to load vars for managed-node1 28285 1727204266.03585: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204266.03588: Calling groups_plugins_play to load vars for managed-node1 28285 1727204266.03819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204266.04004: done with get_vars() 28285 1727204266.04014: done getting variables 28285 1727204266.04978: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:57:46 -0400 (0:00:00.108) 0:00:06.860 ***** 28285 1727204266.05012: entering _queue_task() for managed-node1/debug 28285 1727204266.05269: worker is 1 (out of 1 available) 28285 1727204266.05281: exiting _queue_task() for managed-node1/debug 28285 1727204266.05293: done queuing things up, now waiting for results queue to drain 28285 1727204266.05294: waiting for pending results... 
28285 1727204266.06017: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28285 1727204266.06128: in run() - task 0affcd87-79f5-57a1-d976-00000000002f 28285 1727204266.06141: variable 'ansible_search_path' from source: unknown 28285 1727204266.06145: variable 'ansible_search_path' from source: unknown 28285 1727204266.06575: calling self._execute() 28285 1727204266.06668: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204266.06681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204266.06695: variable 'omit' from source: magic vars 28285 1727204266.07539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204266.10353: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204266.10719: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204266.10772: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204266.10811: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204266.10843: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204266.10932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204266.10975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204266.11007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204266.11055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204266.11083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204266.11998: variable 'ansible_distribution' from source: facts 28285 1727204266.12011: variable 'ansible_distribution_major_version' from source: facts 28285 1727204266.12036: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204266.12044: when evaluation is False, skipping this task 28285 1727204266.12054: _execute() done 28285 1727204266.12060: dumping result to json 28285 1727204266.12072: done dumping result, returning 28285 1727204266.12085: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-57a1-d976-00000000002f] 28285 1727204266.12097: sending task result for task 0affcd87-79f5-57a1-d976-00000000002f skipping: [managed-node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 
9)" } 28285 1727204266.12244: no more pending results, returning what we have 28285 1727204266.12247: results queue empty 28285 1727204266.12248: checking for any_errors_fatal 28285 1727204266.12255: done checking for any_errors_fatal 28285 1727204266.12256: checking for max_fail_percentage 28285 1727204266.12258: done checking for max_fail_percentage 28285 1727204266.12259: checking to see if all hosts have failed and the running result is not ok 28285 1727204266.12259: done checking to see if all hosts have failed 28285 1727204266.12260: getting the remaining hosts for this loop 28285 1727204266.12262: done getting the remaining hosts for this loop 28285 1727204266.12268: getting the next task for host managed-node1 28285 1727204266.12274: done getting next task for host managed-node1 28285 1727204266.12279: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 28285 1727204266.12283: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204266.12299: getting variables 28285 1727204266.12301: in VariableManager get_vars() 28285 1727204266.12355: Calling all_inventory to load vars for managed-node1 28285 1727204266.12359: Calling groups_inventory to load vars for managed-node1 28285 1727204266.12361: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204266.12374: Calling all_plugins_play to load vars for managed-node1 28285 1727204266.12377: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204266.12380: Calling groups_plugins_play to load vars for managed-node1 28285 1727204266.12556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204266.12760: done with get_vars() 28285 1727204266.12773: done getting variables 28285 1727204266.12808: done sending task result for task 0affcd87-79f5-57a1-d976-00000000002f 28285 1727204266.12811: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:57:46 -0400 (0:00:00.078) 0:00:06.938 ***** 28285 1727204266.12878: entering _queue_task() for managed-node1/ping 28285 1727204266.12880: Creating lock for ping 28285 1727204266.13487: worker is 1 (out of 1 available) 28285 1727204266.13500: exiting _queue_task() for managed-node1/ping 28285 1727204266.13513: done queuing things up, now waiting for results queue to drain 28285 1727204266.13515: waiting for pending results... 
28285 1727204266.14212: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 28285 1727204266.14327: in run() - task 0affcd87-79f5-57a1-d976-000000000030 28285 1727204266.14685: variable 'ansible_search_path' from source: unknown 28285 1727204266.14694: variable 'ansible_search_path' from source: unknown 28285 1727204266.14735: calling self._execute() 28285 1727204266.14819: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204266.14832: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204266.14847: variable 'omit' from source: magic vars 28285 1727204266.15683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204266.19467: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204266.19545: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204266.19594: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204266.19636: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204266.19672: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204266.19759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204266.19796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204266.19833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204266.19883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204266.19901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204266.20054: variable 'ansible_distribution' from source: facts 28285 1727204266.20068: variable 'ansible_distribution_major_version' from source: facts 28285 1727204266.20090: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204266.20097: when evaluation is False, skipping this task 28285 1727204266.20103: _execute() done 28285 1727204266.20108: dumping result to json 28285 1727204266.20114: done dumping result, returning 28285 1727204266.20125: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-57a1-d976-000000000030] 28285 1727204266.20138: sending task result for task 0affcd87-79f5-57a1-d976-000000000030 skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": 
"Conditional result was False" } 28285 1727204266.20284: no more pending results, returning what we have 28285 1727204266.20288: results queue empty 28285 1727204266.20289: checking for any_errors_fatal 28285 1727204266.20294: done checking for any_errors_fatal 28285 1727204266.20295: checking for max_fail_percentage 28285 1727204266.20297: done checking for max_fail_percentage 28285 1727204266.20298: checking to see if all hosts have failed and the running result is not ok 28285 1727204266.20299: done checking to see if all hosts have failed 28285 1727204266.20299: getting the remaining hosts for this loop 28285 1727204266.20301: done getting the remaining hosts for this loop 28285 1727204266.20305: getting the next task for host managed-node1 28285 1727204266.20313: done getting next task for host managed-node1 28285 1727204266.20315: ^ task is: TASK: meta (role_complete) 28285 1727204266.20317: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204266.20335: getting variables 28285 1727204266.20337: in VariableManager get_vars() 28285 1727204266.20392: Calling all_inventory to load vars for managed-node1 28285 1727204266.20395: Calling groups_inventory to load vars for managed-node1 28285 1727204266.20397: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204266.20408: Calling all_plugins_play to load vars for managed-node1 28285 1727204266.20410: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204266.20413: Calling groups_plugins_play to load vars for managed-node1 28285 1727204266.20618: done sending task result for task 0affcd87-79f5-57a1-d976-000000000030 28285 1727204266.20623: WORKER PROCESS EXITING 28285 1727204266.20642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204266.20843: done with get_vars() 28285 1727204266.20854: done getting variables 28285 1727204266.20936: done queuing things up, now waiting for results queue to drain 28285 1727204266.20938: results queue empty 28285 1727204266.20939: checking for any_errors_fatal 28285 1727204266.20941: done checking for any_errors_fatal 28285 1727204266.20942: checking for max_fail_percentage 28285 1727204266.20943: done checking for max_fail_percentage 28285 1727204266.20944: checking to see if all hosts have failed and the running result is not ok 28285 1727204266.20944: done checking to see if all hosts have failed 28285 1727204266.20945: getting the remaining hosts for this loop 28285 1727204266.20946: done getting the remaining hosts for this loop 28285 1727204266.20948: getting the next task for host managed-node1 28285 1727204266.20952: done getting next task for host managed-node1 28285 1727204266.20954: ^ task is: TASK: Get current device features 28285 1727204266.20956: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204266.20958: getting variables 28285 1727204266.20959: in VariableManager get_vars() 28285 1727204266.20981: Calling all_inventory to load vars for managed-node1 28285 1727204266.20984: Calling groups_inventory to load vars for managed-node1 28285 1727204266.20986: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204266.20991: Calling all_plugins_play to load vars for managed-node1 28285 1727204266.20993: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204266.20996: Calling groups_plugins_play to load vars for managed-node1 28285 1727204266.21128: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204266.22019: done with get_vars() 28285 1727204266.22028: done getting variables 28285 1727204266.22069: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get current device features] ********************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethtool_features.yml:53 Tuesday 24 September 2024 14:57:46 -0400 (0:00:00.092) 0:00:07.030 ***** 28285 1727204266.22094: entering _queue_task() for managed-node1/command 28285 1727204266.22368: worker is 1 (out of 1 available) 28285 1727204266.22379: exiting _queue_task() for managed-node1/command 28285 1727204266.22391: done queuing things up, now waiting for results queue to drain 28285 1727204266.22392: waiting for pending results... 
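
Editor's note: the action loaded for this task is the command module; the task body at tests_ethtool_features.yml:53 is not reproduced in the log. A plausible sketch, assuming the test captures the device's offload settings with ethtool and registers them for the later assert (the interface variable and register name are hypothetical):

    # Sketch under stated assumptions; only the task name and the use of the
    # command action are taken from the log above.
    - name: Get current device features
      ansible.builtin.command: ethtool --show-features {{ interface }}
      register: current_dev_features
      changed_when: false   # a read-only query should never report a change
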
28285 1727204266.23255: running TaskExecutor() for managed-node1/TASK: Get current device features 28285 1727204266.23473: in run() - task 0affcd87-79f5-57a1-d976-000000000060 28285 1727204266.23593: variable 'ansible_search_path' from source: unknown 28285 1727204266.23692: calling self._execute() 28285 1727204266.23882: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204266.23953: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204266.23972: variable 'omit' from source: magic vars 28285 1727204266.24999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204266.30705: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204266.30841: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204266.30922: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204266.31026: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204266.31126: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204266.31320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204266.31355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204266.31388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204266.31467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204266.31551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204266.31813: variable 'ansible_distribution' from source: facts 28285 1727204266.31868: variable 'ansible_distribution_major_version' from source: facts 28285 1727204266.31892: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204266.31942: when evaluation is False, skipping this task 28285 1727204266.31950: _execute() done 28285 1727204266.31956: dumping result to json 28285 1727204266.31968: done dumping result, returning 28285 1727204266.31984: done running TaskExecutor() for managed-node1/TASK: Get current device features [0affcd87-79f5-57a1-d976-000000000060] 28285 1727204266.31995: sending task result for task 0affcd87-79f5-57a1-d976-000000000060 skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204266.32143: no more pending results, returning what we have 28285 1727204266.32147: results 
queue empty 28285 1727204266.32148: checking for any_errors_fatal 28285 1727204266.32150: done checking for any_errors_fatal 28285 1727204266.32151: checking for max_fail_percentage 28285 1727204266.32153: done checking for max_fail_percentage 28285 1727204266.32154: checking to see if all hosts have failed and the running result is not ok 28285 1727204266.32154: done checking to see if all hosts have failed 28285 1727204266.32155: getting the remaining hosts for this loop 28285 1727204266.32157: done getting the remaining hosts for this loop 28285 1727204266.32161: getting the next task for host managed-node1 28285 1727204266.32169: done getting next task for host managed-node1 28285 1727204266.32172: ^ task is: TASK: ASSERT: The profile does not change the ethtool features 28285 1727204266.32174: ^ state is: HOST STATE: block=3, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204266.32178: getting variables 28285 1727204266.32180: in VariableManager get_vars() 28285 1727204266.32241: Calling all_inventory to load vars for managed-node1 28285 1727204266.32244: Calling groups_inventory to load vars for managed-node1 28285 1727204266.32247: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204266.32260: Calling all_plugins_play to load vars for managed-node1 28285 1727204266.32262: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204266.32267: Calling groups_plugins_play to load vars for managed-node1 28285 1727204266.32444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204266.32694: done with get_vars() 28285 1727204266.32705: done getting variables 28285 1727204266.33035: done sending task result for task 0affcd87-79f5-57a1-d976-000000000060 28285 1727204266.33038: WORKER PROCESS EXITING 28285 1727204266.33113: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [ASSERT: The profile does not change the ethtool features] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethtool_features.yml:57 Tuesday 24 September 2024 14:57:46 -0400 (0:00:00.110) 0:00:07.141 ***** 28285 1727204266.33144: entering _queue_task() for managed-node1/assert 28285 1727204266.33146: Creating lock for assert 28285 1727204266.33465: worker is 1 (out of 1 available) 28285 1727204266.33477: exiting _queue_task() for managed-node1/assert 28285 1727204266.33490: done queuing things up, now waiting for results queue to drain 28285 1727204266.33499: waiting for pending results... 
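
Editor's note: the action being queued here is the assert module (note the lock being created for it). A minimal sketch of such a check, assuming the test compares ethtool output captured before and after the profile was applied; original_dev_features and current_dev_features are hypothetical register names, not taken from the log:

    # Hypothetical comparison; the real assertion at
    # tests_ethtool_features.yml:57 is not shown in this log.
    - name: "ASSERT: The profile does not change the ethtool features"
      ansible.builtin.assert:
        that:
          - original_dev_features.stdout == current_dev_features.stdout
        fail_msg: Applying the profile changed the device's ethtool features
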
28285 1727204266.33773: running TaskExecutor() for managed-node1/TASK: ASSERT: The profile does not change the ethtool features 28285 1727204266.33869: in run() - task 0affcd87-79f5-57a1-d976-000000000061 28285 1727204266.33886: variable 'ansible_search_path' from source: unknown 28285 1727204266.33924: calling self._execute() 28285 1727204266.34022: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204266.34036: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204266.34056: variable 'omit' from source: magic vars 28285 1727204266.34514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204266.37237: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204266.37304: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204266.37364: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204266.37404: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204266.37441: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204266.37530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204266.37575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204266.37606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204266.37661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204266.37684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204266.37961: variable 'ansible_distribution' from source: facts 28285 1727204266.37978: variable 'ansible_distribution_major_version' from source: facts 28285 1727204266.38004: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204266.38017: when evaluation is False, skipping this task 28285 1727204266.38024: _execute() done 28285 1727204266.38032: dumping result to json 28285 1727204266.38039: done dumping result, returning 28285 1727204266.38051: done running TaskExecutor() for managed-node1/TASK: ASSERT: The profile does not change the ethtool features [0affcd87-79f5-57a1-d976-000000000061] 28285 1727204266.38062: sending task result for task 0affcd87-79f5-57a1-d976-000000000061 skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204266.38222: no more pending 
results, returning what we have 28285 1727204266.38225: results queue empty 28285 1727204266.38226: checking for any_errors_fatal 28285 1727204266.38232: done checking for any_errors_fatal 28285 1727204266.38233: checking for max_fail_percentage 28285 1727204266.38234: done checking for max_fail_percentage 28285 1727204266.38235: checking to see if all hosts have failed and the running result is not ok 28285 1727204266.38236: done checking to see if all hosts have failed 28285 1727204266.38237: getting the remaining hosts for this loop 28285 1727204266.38239: done getting the remaining hosts for this loop 28285 1727204266.38242: getting the next task for host managed-node1 28285 1727204266.38249: done getting next task for host managed-node1 28285 1727204266.38252: ^ task is: TASK: TEST: I can disable gro and tx-tcp-segmentation and enable gso. 28285 1727204266.38254: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204266.38258: getting variables 28285 1727204266.38259: in VariableManager get_vars() 28285 1727204266.38319: Calling all_inventory to load vars for managed-node1 28285 1727204266.38322: Calling groups_inventory to load vars for managed-node1 28285 1727204266.38325: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204266.38336: Calling all_plugins_play to load vars for managed-node1 28285 1727204266.38338: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204266.38342: Calling groups_plugins_play to load vars for managed-node1 28285 1727204266.38524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204266.38748: done with get_vars() 28285 1727204266.38759: done getting variables 28285 1727204266.38910: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 28285 1727204266.38933: done sending task result for task 0affcd87-79f5-57a1-d976-000000000061 28285 1727204266.38937: WORKER PROCESS EXITING TASK [TEST: I can disable gro and tx-tcp-segmentation and enable gso.] ********* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethtool_features.yml:62 Tuesday 24 September 2024 14:57:46 -0400 (0:00:00.058) 0:00:07.199 ***** 28285 1727204266.38979: entering _queue_task() for managed-node1/debug 28285 1727204266.40423: worker is 1 (out of 1 available) 28285 1727204266.40437: exiting _queue_task() for managed-node1/debug 28285 1727204266.40449: done queuing things up, now waiting for results queue to drain 28285 1727204266.40451: waiting for pending results... 28285 1727204266.40851: running TaskExecutor() for managed-node1/TASK: TEST: I can disable gro and tx-tcp-segmentation and enable gso. 
28285 1727204266.40950: in run() - task 0affcd87-79f5-57a1-d976-000000000062 28285 1727204266.40974: variable 'ansible_search_path' from source: unknown 28285 1727204266.41017: calling self._execute() 28285 1727204266.41113: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204266.41125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204266.41139: variable 'omit' from source: magic vars 28285 1727204266.41640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204266.45454: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204266.45610: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204266.45661: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204266.45705: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204266.45737: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204266.45826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204266.45865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204266.45907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204266.45956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204266.46052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204266.46214: variable 'ansible_distribution' from source: facts 28285 1727204266.46229: variable 'ansible_distribution_major_version' from source: facts 28285 1727204266.46269: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204266.46278: when evaluation is False, skipping this task 28285 1727204266.46285: _execute() done 28285 1727204266.46291: dumping result to json 28285 1727204266.46299: done dumping result, returning 28285 1727204266.46312: done running TaskExecutor() for managed-node1/TASK: TEST: I can disable gro and tx-tcp-segmentation and enable gso. 
[0affcd87-79f5-57a1-d976-000000000062] 28285 1727204266.46330: sending task result for task 0affcd87-79f5-57a1-d976-000000000062 skipping: [managed-node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 28285 1727204266.46513: no more pending results, returning what we have 28285 1727204266.46517: results queue empty 28285 1727204266.46518: checking for any_errors_fatal 28285 1727204266.46523: done checking for any_errors_fatal 28285 1727204266.46524: checking for max_fail_percentage 28285 1727204266.46526: done checking for max_fail_percentage 28285 1727204266.46527: checking to see if all hosts have failed and the running result is not ok 28285 1727204266.46528: done checking to see if all hosts have failed 28285 1727204266.46528: getting the remaining hosts for this loop 28285 1727204266.46530: done getting the remaining hosts for this loop 28285 1727204266.46534: getting the next task for host managed-node1 28285 1727204266.46542: done getting next task for host managed-node1 28285 1727204266.46552: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28285 1727204266.46556: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204266.46591: getting variables 28285 1727204266.46594: in VariableManager get_vars() 28285 1727204266.46653: Calling all_inventory to load vars for managed-node1 28285 1727204266.46657: Calling groups_inventory to load vars for managed-node1 28285 1727204266.46659: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204266.46671: Calling all_plugins_play to load vars for managed-node1 28285 1727204266.46674: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204266.46678: Calling groups_plugins_play to load vars for managed-node1 28285 1727204266.47122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204266.47566: done with get_vars() 28285 1727204266.47576: done getting variables 28285 1727204266.47921: done sending task result for task 0affcd87-79f5-57a1-d976-000000000062 28285 1727204266.47924: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:57:46 -0400 (0:00:00.090) 0:00:07.290 ***** 28285 1727204266.48000: entering _queue_task() for managed-node1/include_tasks 28285 1727204266.48441: worker is 1 (out of 1 available) 28285 1727204266.48456: exiting _queue_task() for managed-node1/include_tasks 28285 1727204266.48469: done queuing things up, now waiting for results queue to drain 28285 1727204266.48471: waiting for pending results... 
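
Editor's note: the queued action is include_tasks, so at main.yml:4 the role pulls in a helper task file that makes sure the facts it relies on (distribution, version, and so on) are present. Both the included file name and the gathering logic are assumptions; only the task name and the include_tasks action come from the log:

    # Sketch of a conventional pattern; the included file name is assumed.
    - name: Ensure ansible_facts used by role
      ansible.builtin.include_tasks: set_facts.yml
    # The included file would typically run ansible.builtin.setup with a
    # restricted gather_subset when the required facts are not yet present.
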
28285 1727204266.49134: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28285 1727204266.49554: in run() - task 0affcd87-79f5-57a1-d976-00000000006a 28285 1727204266.49685: variable 'ansible_search_path' from source: unknown 28285 1727204266.49694: variable 'ansible_search_path' from source: unknown 28285 1727204266.49736: calling self._execute() 28285 1727204266.49856: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204266.49980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204266.49999: variable 'omit' from source: magic vars 28285 1727204266.50898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204266.54847: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204266.54984: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204266.55037: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204266.55083: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204266.55116: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204266.55200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204266.55237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204266.55273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204266.55316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204266.55346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204266.55517: variable 'ansible_distribution' from source: facts 28285 1727204266.55529: variable 'ansible_distribution_major_version' from source: facts 28285 1727204266.55561: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204266.55572: when evaluation is False, skipping this task 28285 1727204266.55579: _execute() done 28285 1727204266.55585: dumping result to json 28285 1727204266.55594: done dumping result, returning 28285 1727204266.55609: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-57a1-d976-00000000006a] 28285 1727204266.55619: sending task result for task 0affcd87-79f5-57a1-d976-00000000006a skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int 
< 9)", "skip_reason": "Conditional result was False" } 28285 1727204266.55770: no more pending results, returning what we have 28285 1727204266.55774: results queue empty 28285 1727204266.55774: checking for any_errors_fatal 28285 1727204266.55781: done checking for any_errors_fatal 28285 1727204266.55782: checking for max_fail_percentage 28285 1727204266.55784: done checking for max_fail_percentage 28285 1727204266.55785: checking to see if all hosts have failed and the running result is not ok 28285 1727204266.55785: done checking to see if all hosts have failed 28285 1727204266.55786: getting the remaining hosts for this loop 28285 1727204266.55788: done getting the remaining hosts for this loop 28285 1727204266.55791: getting the next task for host managed-node1 28285 1727204266.55797: done getting next task for host managed-node1 28285 1727204266.55801: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 28285 1727204266.55804: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204266.55825: getting variables 28285 1727204266.55827: in VariableManager get_vars() 28285 1727204266.55880: Calling all_inventory to load vars for managed-node1 28285 1727204266.55883: Calling groups_inventory to load vars for managed-node1 28285 1727204266.55885: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204266.55896: Calling all_plugins_play to load vars for managed-node1 28285 1727204266.55898: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204266.55902: Calling groups_plugins_play to load vars for managed-node1 28285 1727204266.56431: done sending task result for task 0affcd87-79f5-57a1-d976-00000000006a 28285 1727204266.56434: WORKER PROCESS EXITING 28285 1727204266.56473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204266.56668: done with get_vars() 28285 1727204266.56679: done getting variables 28285 1727204266.56759: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:57:46 -0400 (0:00:00.088) 0:00:07.380 ***** 28285 1727204266.57051: entering _queue_task() for managed-node1/debug 28285 1727204266.57374: worker is 1 (out of 1 available) 28285 1727204266.57388: exiting _queue_task() for managed-node1/debug 28285 1727204266.57400: done queuing things up, now waiting for results queue to drain 28285 1727204266.57401: waiting for pending results... 
28285 1727204266.57688: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider 28285 1727204266.57831: in run() - task 0affcd87-79f5-57a1-d976-00000000006b 28285 1727204266.57855: variable 'ansible_search_path' from source: unknown 28285 1727204266.57862: variable 'ansible_search_path' from source: unknown 28285 1727204266.57905: calling self._execute() 28285 1727204266.58001: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204266.58013: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204266.58037: variable 'omit' from source: magic vars 28285 1727204266.58524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204266.62851: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204266.62938: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204266.62985: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204266.63026: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204266.63709: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204266.63859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204266.63954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204266.64058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204266.64109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204266.64187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204266.64510: variable 'ansible_distribution' from source: facts 28285 1727204266.64583: variable 'ansible_distribution_major_version' from source: facts 28285 1727204266.64609: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204266.64617: when evaluation is False, skipping this task 28285 1727204266.64624: _execute() done 28285 1727204266.64630: dumping result to json 28285 1727204266.64679: done dumping result, returning 28285 1727204266.64796: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-57a1-d976-00000000006b] 28285 1727204266.64808: sending task result for task 0affcd87-79f5-57a1-d976-00000000006b skipping: [managed-node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 28285 1727204266.64958: no more 
pending results, returning what we have 28285 1727204266.64962: results queue empty 28285 1727204266.64965: checking for any_errors_fatal 28285 1727204266.64970: done checking for any_errors_fatal 28285 1727204266.64971: checking for max_fail_percentage 28285 1727204266.64973: done checking for max_fail_percentage 28285 1727204266.64974: checking to see if all hosts have failed and the running result is not ok 28285 1727204266.64975: done checking to see if all hosts have failed 28285 1727204266.64976: getting the remaining hosts for this loop 28285 1727204266.64977: done getting the remaining hosts for this loop 28285 1727204266.64981: getting the next task for host managed-node1 28285 1727204266.64988: done getting next task for host managed-node1 28285 1727204266.64993: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28285 1727204266.64996: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204266.65013: getting variables 28285 1727204266.65015: in VariableManager get_vars() 28285 1727204266.65075: Calling all_inventory to load vars for managed-node1 28285 1727204266.65079: Calling groups_inventory to load vars for managed-node1 28285 1727204266.65081: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204266.65092: Calling all_plugins_play to load vars for managed-node1 28285 1727204266.65095: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204266.65098: Calling groups_plugins_play to load vars for managed-node1 28285 1727204266.65355: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204266.65581: done with get_vars() 28285 1727204266.65593: done getting variables 28285 1727204266.65941: done sending task result for task 0affcd87-79f5-57a1-d976-00000000006b 28285 1727204266.65944: WORKER PROCESS EXITING 28285 1727204266.65987: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:57:46 -0400 (0:00:00.089) 0:00:07.470 ***** 28285 1727204266.66028: entering _queue_task() for managed-node1/fail 28285 1727204266.66817: worker is 1 (out of 1 available) 28285 1727204266.66829: exiting _queue_task() for managed-node1/fail 28285 1727204266.66841: done queuing things up, now waiting for results queue to drain 28285 1727204266.66843: waiting for pending results... 
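
Editor's note: the queued action is the fail module. Per its name, the check at main.yml:11 guards against a combination the role cannot honour: declarative network_state configuration together with the initscripts provider. The condition and message below are assumptions reconstructed from the task name; only the fail action is confirmed by the log:

    # Hypothetical guard; condition and wording are assumptions.
    - name: >-
        Abort applying the network state configuration if using the
        network_state variable with the initscripts provider
      ansible.builtin.fail:
        msg: The network_state variable is not supported with the initscripts provider
      when:
        - network_state | d({}) != {}
        - network_provider == "initscripts"
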
28285 1727204266.67671: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28285 1727204266.67809: in run() - task 0affcd87-79f5-57a1-d976-00000000006c 28285 1727204266.67827: variable 'ansible_search_path' from source: unknown 28285 1727204266.67834: variable 'ansible_search_path' from source: unknown 28285 1727204266.67876: calling self._execute() 28285 1727204266.67973: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204266.67984: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204266.68004: variable 'omit' from source: magic vars 28285 1727204266.68496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204266.71130: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204266.71232: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204266.71285: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204266.71327: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204266.71358: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204266.71452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204266.71498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204266.71535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204266.71585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204266.71615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204266.71772: variable 'ansible_distribution' from source: facts 28285 1727204266.71784: variable 'ansible_distribution_major_version' from source: facts 28285 1727204266.71812: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204266.71824: when evaluation is False, skipping this task 28285 1727204266.71832: _execute() done 28285 1727204266.71842: dumping result to json 28285 1727204266.71850: done dumping result, returning 28285 1727204266.71865: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-57a1-d976-00000000006c] 28285 1727204266.71878: sending task result for task 
0affcd87-79f5-57a1-d976-00000000006c skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204266.72037: no more pending results, returning what we have 28285 1727204266.72041: results queue empty 28285 1727204266.72042: checking for any_errors_fatal 28285 1727204266.72047: done checking for any_errors_fatal 28285 1727204266.72048: checking for max_fail_percentage 28285 1727204266.72050: done checking for max_fail_percentage 28285 1727204266.72051: checking to see if all hosts have failed and the running result is not ok 28285 1727204266.72052: done checking to see if all hosts have failed 28285 1727204266.72053: getting the remaining hosts for this loop 28285 1727204266.72055: done getting the remaining hosts for this loop 28285 1727204266.72059: getting the next task for host managed-node1 28285 1727204266.72067: done getting next task for host managed-node1 28285 1727204266.72071: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28285 1727204266.72075: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28285 1727204266.72092: getting variables 28285 1727204266.72094: in VariableManager get_vars() 28285 1727204266.72148: Calling all_inventory to load vars for managed-node1 28285 1727204266.72152: Calling groups_inventory to load vars for managed-node1 28285 1727204266.72154: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204266.72167: Calling all_plugins_play to load vars for managed-node1 28285 1727204266.72170: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204266.72173: Calling groups_plugins_play to load vars for managed-node1 28285 1727204266.72352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204266.72593: done with get_vars() 28285 1727204266.72606: done getting variables 28285 1727204266.72807: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 28285 1727204266.72838: done sending task result for task 0affcd87-79f5-57a1-d976-00000000006c 28285 1727204266.72842: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:57:46 -0400 (0:00:00.068) 0:00:07.538 ***** 28285 1727204266.72858: entering _queue_task() for managed-node1/fail 28285 1727204266.73350: worker is 1 (out of 1 available) 28285 1727204266.73367: exiting _queue_task() for managed-node1/fail 28285 1727204266.73380: done queuing things up, now waiting for results queue to drain 28285 1727204266.73382: waiting for pending results... 
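
Editor's note: again a fail action is queued; per the task name, main.yml:18 refuses to apply network_state on hosts older than EL8. Everything below apart from the task name and the fail action is an assumption:

    # Hypothetical guard reconstructed from the task name only.
    - name: >-
        Abort applying the network state configuration if the system version
        of the managed host is below 8
      ansible.builtin.fail:
        msg: Applying network_state requires a managed host running EL8 or later
      when:
        - network_state | d({}) != {}
        - ansible_distribution_major_version | int < 8
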
28285 1727204266.73659: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28285 1727204266.73798: in run() - task 0affcd87-79f5-57a1-d976-00000000006d 28285 1727204266.73815: variable 'ansible_search_path' from source: unknown 28285 1727204266.73825: variable 'ansible_search_path' from source: unknown 28285 1727204266.73868: calling self._execute() 28285 1727204266.73965: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204266.73976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204266.73988: variable 'omit' from source: magic vars 28285 1727204266.74452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204266.77053: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204266.77101: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204266.77128: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204266.77154: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204266.77181: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204266.77236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204266.77257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204266.77283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204266.77309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204266.77320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204266.77426: variable 'ansible_distribution' from source: facts 28285 1727204266.77432: variable 'ansible_distribution_major_version' from source: facts 28285 1727204266.77447: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204266.77453: when evaluation is False, skipping this task 28285 1727204266.77455: _execute() done 28285 1727204266.77458: dumping result to json 28285 1727204266.77460: done dumping result, returning 28285 1727204266.77467: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-57a1-d976-00000000006d] 28285 1727204266.77473: sending task result for task 0affcd87-79f5-57a1-d976-00000000006d 28285 1727204266.77565: 
done sending task result for task 0affcd87-79f5-57a1-d976-00000000006d 28285 1727204266.77568: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204266.77610: no more pending results, returning what we have 28285 1727204266.77613: results queue empty 28285 1727204266.77614: checking for any_errors_fatal 28285 1727204266.77622: done checking for any_errors_fatal 28285 1727204266.77622: checking for max_fail_percentage 28285 1727204266.77624: done checking for max_fail_percentage 28285 1727204266.77625: checking to see if all hosts have failed and the running result is not ok 28285 1727204266.77627: done checking to see if all hosts have failed 28285 1727204266.77627: getting the remaining hosts for this loop 28285 1727204266.77629: done getting the remaining hosts for this loop 28285 1727204266.77633: getting the next task for host managed-node1 28285 1727204266.77639: done getting next task for host managed-node1 28285 1727204266.77643: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28285 1727204266.77646: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28285 1727204266.77671: getting variables 28285 1727204266.77673: in VariableManager get_vars() 28285 1727204266.77723: Calling all_inventory to load vars for managed-node1 28285 1727204266.77725: Calling groups_inventory to load vars for managed-node1 28285 1727204266.77727: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204266.77736: Calling all_plugins_play to load vars for managed-node1 28285 1727204266.77738: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204266.77741: Calling groups_plugins_play to load vars for managed-node1 28285 1727204266.77909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204266.78032: done with get_vars() 28285 1727204266.78040: done getting variables 28285 1727204266.78084: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:57:46 -0400 (0:00:00.052) 0:00:07.591 ***** 28285 1727204266.78110: entering _queue_task() for managed-node1/fail 28285 1727204266.78422: worker is 1 (out of 1 available) 28285 1727204266.78471: exiting _queue_task() for managed-node1/fail 28285 1727204266.78482: done queuing things up, now waiting for results queue to drain 28285 1727204266.78483: waiting for pending results... 
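Both abort guards in this block are skipped because the reported false_condition, (ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9), does not hold for managed-node1. As a minimal sketch of the pattern being exercised here, and not a copy of the role's tasks/main.yml (which combines this with further conditions), such a guard is an ansible.builtin.fail task gated by a when: expression on the gathered facts:

    # Minimal sketch of a fact-gated abort task. The task name and message
    # are illustrative; only the when: expression is taken from the
    # false_condition printed in the skip results above.
    - name: Abort on platforms where this configuration is unsupported
      ansible.builtin.fail:
        msg: >-
          Not supported on {{ ansible_distribution }}
          {{ ansible_distribution_major_version }}.
      when: >-
        ansible_distribution in ['CentOS', 'RedHat']
        and ansible_distribution_major_version | int < 9
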
28285 1727204266.78681: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28285 1727204266.78819: in run() - task 0affcd87-79f5-57a1-d976-00000000006e 28285 1727204266.78845: variable 'ansible_search_path' from source: unknown 28285 1727204266.78855: variable 'ansible_search_path' from source: unknown 28285 1727204266.78897: calling self._execute() 28285 1727204266.78995: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204266.79006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204266.79023: variable 'omit' from source: magic vars 28285 1727204266.79494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204266.81670: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204266.81725: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204266.81757: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204266.81785: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204266.81807: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204266.81867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204266.81887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204266.81905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204266.81935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204266.81945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204266.82056: variable 'ansible_distribution' from source: facts 28285 1727204266.82061: variable 'ansible_distribution_major_version' from source: facts 28285 1727204266.82080: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204266.82083: when evaluation is False, skipping this task 28285 1727204266.82086: _execute() done 28285 1727204266.82088: dumping result to json 28285 1727204266.82090: done dumping result, returning 28285 1727204266.82098: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-57a1-d976-00000000006e] 28285 1727204266.82103: sending task result for task 0affcd87-79f5-57a1-d976-00000000006e 28285 1727204266.82196: done 
sending task result for task 0affcd87-79f5-57a1-d976-00000000006e 28285 1727204266.82199: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204266.82274: no more pending results, returning what we have 28285 1727204266.82277: results queue empty 28285 1727204266.82278: checking for any_errors_fatal 28285 1727204266.82284: done checking for any_errors_fatal 28285 1727204266.82285: checking for max_fail_percentage 28285 1727204266.82286: done checking for max_fail_percentage 28285 1727204266.82288: checking to see if all hosts have failed and the running result is not ok 28285 1727204266.82288: done checking to see if all hosts have failed 28285 1727204266.82289: getting the remaining hosts for this loop 28285 1727204266.82291: done getting the remaining hosts for this loop 28285 1727204266.82294: getting the next task for host managed-node1 28285 1727204266.82300: done getting next task for host managed-node1 28285 1727204266.82304: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28285 1727204266.82309: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28285 1727204266.82324: getting variables 28285 1727204266.82325: in VariableManager get_vars() 28285 1727204266.82382: Calling all_inventory to load vars for managed-node1 28285 1727204266.82385: Calling groups_inventory to load vars for managed-node1 28285 1727204266.82387: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204266.82395: Calling all_plugins_play to load vars for managed-node1 28285 1727204266.82397: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204266.82400: Calling groups_plugins_play to load vars for managed-node1 28285 1727204266.82556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204266.82798: done with get_vars() 28285 1727204266.82814: done getting variables 28285 1727204266.82882: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:57:46 -0400 (0:00:00.048) 0:00:07.639 ***** 28285 1727204266.82923: entering _queue_task() for managed-node1/dnf 28285 1727204266.83201: worker is 1 (out of 1 available) 28285 1727204266.83212: exiting _queue_task() for managed-node1/dnf 28285 1727204266.83224: done queuing things up, now waiting for results queue to drain 28285 1727204266.83225: waiting for pending results... 
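Every task in this section reports the same false_condition, so a quick way to see why the whole block is skipped on a given host is to evaluate that expression against the gathered facts directly. A minimal sketch (not part of the role) using ansible.builtin.debug:

    # Prints the distribution facts and the result of the guard expression
    # that the skip results above keep reporting as false_condition.
    - name: Show why the EL guard tasks are skipped on this host
      ansible.builtin.debug:
        msg: >-
          {{ ansible_distribution }} {{ ansible_distribution_major_version }}:
          guard matches = {{ ansible_distribution in ['CentOS', 'RedHat']
          and ansible_distribution_major_version | int < 9 }}
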
28285 1727204266.83602: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28285 1727204266.83769: in run() - task 0affcd87-79f5-57a1-d976-00000000006f 28285 1727204266.83782: variable 'ansible_search_path' from source: unknown 28285 1727204266.83785: variable 'ansible_search_path' from source: unknown 28285 1727204266.83815: calling self._execute() 28285 1727204266.83891: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204266.83895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204266.83904: variable 'omit' from source: magic vars 28285 1727204266.84221: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204266.85916: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204266.85968: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204266.85998: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204266.86025: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204266.86048: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204266.86108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204266.86127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204266.86151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204266.86181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204266.86192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204266.86301: variable 'ansible_distribution' from source: facts 28285 1727204266.86305: variable 'ansible_distribution_major_version' from source: facts 28285 1727204266.86322: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204266.86325: when evaluation is False, skipping this task 28285 1727204266.86328: _execute() done 28285 1727204266.86330: dumping result to json 28285 1727204266.86332: done dumping result, returning 28285 1727204266.86339: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-57a1-d976-00000000006f] 28285 1727204266.86345: sending task result for task 
0affcd87-79f5-57a1-d976-00000000006f 28285 1727204266.86442: done sending task result for task 0affcd87-79f5-57a1-d976-00000000006f 28285 1727204266.86444: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204266.86493: no more pending results, returning what we have 28285 1727204266.86497: results queue empty 28285 1727204266.86498: checking for any_errors_fatal 28285 1727204266.86503: done checking for any_errors_fatal 28285 1727204266.86504: checking for max_fail_percentage 28285 1727204266.86505: done checking for max_fail_percentage 28285 1727204266.86506: checking to see if all hosts have failed and the running result is not ok 28285 1727204266.86507: done checking to see if all hosts have failed 28285 1727204266.86507: getting the remaining hosts for this loop 28285 1727204266.86509: done getting the remaining hosts for this loop 28285 1727204266.86513: getting the next task for host managed-node1 28285 1727204266.86524: done getting next task for host managed-node1 28285 1727204266.86527: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28285 1727204266.86531: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28285 1727204266.86547: getting variables 28285 1727204266.86551: in VariableManager get_vars() 28285 1727204266.86604: Calling all_inventory to load vars for managed-node1 28285 1727204266.86607: Calling groups_inventory to load vars for managed-node1 28285 1727204266.86609: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204266.86618: Calling all_plugins_play to load vars for managed-node1 28285 1727204266.86621: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204266.86623: Calling groups_plugins_play to load vars for managed-node1 28285 1727204266.86809: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204266.86934: done with get_vars() 28285 1727204266.86942: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28285 1727204266.86999: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:57:46 -0400 (0:00:00.041) 0:00:07.680 ***** 28285 1727204266.87022: entering _queue_task() for managed-node1/yum 28285 1727204266.87229: worker is 1 (out of 1 available) 28285 1727204266.87242: exiting _queue_task() for managed-node1/yum 28285 1727204266.87256: done queuing things up, now waiting for results queue to drain 28285 1727204266.87258: waiting for pending results... 
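The update-check pair runs once against the dnf action and once against the yum name; the "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" entry above shows that on this ansible-core version both names resolve to the same dnf action plugin. A hedged sketch of such a check-only task, assuming a hypothetical package list variable (this is not the role's actual source):

    # Runs the package manager in check mode so nothing is installed;
    # 'changed' in the registered result indicates updates are available.
    # network_check_packages is a hypothetical variable name.
    - name: Check if updates for network packages are available
      ansible.builtin.dnf:
        name: "{{ network_check_packages }}"
        state: latest
      check_mode: true
      register: network_update_check
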
28285 1727204266.87418: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28285 1727204266.87504: in run() - task 0affcd87-79f5-57a1-d976-000000000070 28285 1727204266.87514: variable 'ansible_search_path' from source: unknown 28285 1727204266.87517: variable 'ansible_search_path' from source: unknown 28285 1727204266.87544: calling self._execute() 28285 1727204266.87614: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204266.87623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204266.87632: variable 'omit' from source: magic vars 28285 1727204266.87976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204266.89635: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204266.89726: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204266.89771: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204266.89810: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204266.89841: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204266.89923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204266.89960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204266.90002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204266.90051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204266.90075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204266.90221: variable 'ansible_distribution' from source: facts 28285 1727204266.90233: variable 'ansible_distribution_major_version' from source: facts 28285 1727204266.90260: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204266.90270: when evaluation is False, skipping this task 28285 1727204266.90276: _execute() done 28285 1727204266.90282: dumping result to json 28285 1727204266.90289: done dumping result, returning 28285 1727204266.90300: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-57a1-d976-000000000070] 28285 1727204266.90310: sending task result for task 
0affcd87-79f5-57a1-d976-000000000070 skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204266.90466: no more pending results, returning what we have 28285 1727204266.90470: results queue empty 28285 1727204266.90470: checking for any_errors_fatal 28285 1727204266.90476: done checking for any_errors_fatal 28285 1727204266.90477: checking for max_fail_percentage 28285 1727204266.90478: done checking for max_fail_percentage 28285 1727204266.90479: checking to see if all hosts have failed and the running result is not ok 28285 1727204266.90480: done checking to see if all hosts have failed 28285 1727204266.90480: getting the remaining hosts for this loop 28285 1727204266.90482: done getting the remaining hosts for this loop 28285 1727204266.90486: getting the next task for host managed-node1 28285 1727204266.90492: done getting next task for host managed-node1 28285 1727204266.90496: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28285 1727204266.90499: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28285 1727204266.90520: getting variables 28285 1727204266.90522: in VariableManager get_vars() 28285 1727204266.90582: Calling all_inventory to load vars for managed-node1 28285 1727204266.90585: Calling groups_inventory to load vars for managed-node1 28285 1727204266.90587: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204266.90599: Calling all_plugins_play to load vars for managed-node1 28285 1727204266.90601: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204266.90604: Calling groups_plugins_play to load vars for managed-node1 28285 1727204266.90767: done sending task result for task 0affcd87-79f5-57a1-d976-000000000070 28285 1727204266.90771: WORKER PROCESS EXITING 28285 1727204266.90799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204266.91130: done with get_vars() 28285 1727204266.91139: done getting variables 28285 1727204266.91185: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:57:46 -0400 (0:00:00.041) 0:00:07.722 ***** 28285 1727204266.91209: entering _queue_task() for managed-node1/fail 28285 1727204266.91425: worker is 1 (out of 1 available) 28285 1727204266.91436: exiting _queue_task() for managed-node1/fail 28285 1727204266.91447: done queuing things up, now waiting for results queue to drain 28285 1727204266.91452: waiting for pending results... 
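The consent task queued above follows the same guard pattern: it only exists to stop the play unless the operator has explicitly allowed a NetworkManager restart. A rough sketch under the assumption of a hypothetical confirmation variable (the role's real variable name may differ):

    # Illustrative only. network_allow_restart is a hypothetical variable,
    # not necessarily the name the role uses.
    - name: Ask user's consent to restart NetworkManager
      ansible.builtin.fail:
        msg: >-
          Managing wireless or team interfaces requires restarting
          NetworkManager. Set the confirmation variable to proceed.
      when: not (network_allow_restart | default(false) | bool)
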
28285 1727204266.91621: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28285 1727204266.91713: in run() - task 0affcd87-79f5-57a1-d976-000000000071 28285 1727204266.91723: variable 'ansible_search_path' from source: unknown 28285 1727204266.91726: variable 'ansible_search_path' from source: unknown 28285 1727204266.91756: calling self._execute() 28285 1727204266.91823: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204266.91826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204266.91835: variable 'omit' from source: magic vars 28285 1727204266.92143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204266.94274: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204266.94344: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204266.94388: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204266.94428: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204266.94461: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204266.94540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204266.94579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204266.94608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204266.94653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204266.94675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204266.94815: variable 'ansible_distribution' from source: facts 28285 1727204266.94828: variable 'ansible_distribution_major_version' from source: facts 28285 1727204266.94849: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204266.94857: when evaluation is False, skipping this task 28285 1727204266.94863: _execute() done 28285 1727204266.94874: dumping result to json 28285 1727204266.94882: done dumping result, returning 28285 1727204266.94893: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-57a1-d976-000000000071] 28285 1727204266.94903: sending task result for task 0affcd87-79f5-57a1-d976-000000000071 skipping: [managed-node1] => { "changed": false, 
"false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204266.95050: no more pending results, returning what we have 28285 1727204266.95054: results queue empty 28285 1727204266.95055: checking for any_errors_fatal 28285 1727204266.95060: done checking for any_errors_fatal 28285 1727204266.95061: checking for max_fail_percentage 28285 1727204266.95063: done checking for max_fail_percentage 28285 1727204266.95065: checking to see if all hosts have failed and the running result is not ok 28285 1727204266.95066: done checking to see if all hosts have failed 28285 1727204266.95067: getting the remaining hosts for this loop 28285 1727204266.95068: done getting the remaining hosts for this loop 28285 1727204266.95072: getting the next task for host managed-node1 28285 1727204266.95078: done getting next task for host managed-node1 28285 1727204266.95082: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 28285 1727204266.95086: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204266.95102: getting variables 28285 1727204266.95103: in VariableManager get_vars() 28285 1727204266.95155: Calling all_inventory to load vars for managed-node1 28285 1727204266.95158: Calling groups_inventory to load vars for managed-node1 28285 1727204266.95160: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204266.95173: Calling all_plugins_play to load vars for managed-node1 28285 1727204266.95175: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204266.95178: Calling groups_plugins_play to load vars for managed-node1 28285 1727204266.95390: done sending task result for task 0affcd87-79f5-57a1-d976-000000000071 28285 1727204266.95393: WORKER PROCESS EXITING 28285 1727204266.95410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204266.95632: done with get_vars() 28285 1727204266.95642: done getting variables 28285 1727204266.95707: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:57:46 -0400 (0:00:00.045) 0:00:07.767 ***** 28285 1727204266.95740: entering _queue_task() for managed-node1/package 28285 1727204266.96013: worker is 1 (out of 1 available) 28285 1727204266.96029: exiting _queue_task() for managed-node1/package 28285 1727204266.96040: done queuing things up, now waiting for results queue to drain 28285 1727204266.96042: waiting for pending results... 
28285 1727204266.96314: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages 28285 1727204266.96452: in run() - task 0affcd87-79f5-57a1-d976-000000000072 28285 1727204266.96476: variable 'ansible_search_path' from source: unknown 28285 1727204266.96486: variable 'ansible_search_path' from source: unknown 28285 1727204266.96522: calling self._execute() 28285 1727204266.96617: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204266.96627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204266.96639: variable 'omit' from source: magic vars 28285 1727204266.97100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204266.99824: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204266.99909: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204266.99952: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204266.99992: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204267.00031: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204267.00120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204267.00156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204267.00189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204267.00243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204267.00268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204267.00418: variable 'ansible_distribution' from source: facts 28285 1727204267.00430: variable 'ansible_distribution_major_version' from source: facts 28285 1727204267.00462: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204267.00472: when evaluation is False, skipping this task 28285 1727204267.00478: _execute() done 28285 1727204267.00484: dumping result to json 28285 1727204267.00490: done dumping result, returning 28285 1727204267.00500: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-57a1-d976-000000000072] 28285 1727204267.00510: sending task result for task 0affcd87-79f5-57a1-d976-000000000072 skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": 
"Conditional result was False" } 28285 1727204267.00670: no more pending results, returning what we have 28285 1727204267.00673: results queue empty 28285 1727204267.00674: checking for any_errors_fatal 28285 1727204267.00680: done checking for any_errors_fatal 28285 1727204267.00681: checking for max_fail_percentage 28285 1727204267.00683: done checking for max_fail_percentage 28285 1727204267.00684: checking to see if all hosts have failed and the running result is not ok 28285 1727204267.00685: done checking to see if all hosts have failed 28285 1727204267.00686: getting the remaining hosts for this loop 28285 1727204267.00687: done getting the remaining hosts for this loop 28285 1727204267.00692: getting the next task for host managed-node1 28285 1727204267.00698: done getting next task for host managed-node1 28285 1727204267.00702: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28285 1727204267.00706: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204267.00725: getting variables 28285 1727204267.00727: in VariableManager get_vars() 28285 1727204267.00788: Calling all_inventory to load vars for managed-node1 28285 1727204267.00791: Calling groups_inventory to load vars for managed-node1 28285 1727204267.00794: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204267.00804: Calling all_plugins_play to load vars for managed-node1 28285 1727204267.00807: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204267.00809: Calling groups_plugins_play to load vars for managed-node1 28285 1727204267.01004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204267.01234: done with get_vars() 28285 1727204267.01246: done getting variables 28285 1727204267.01415: done sending task result for task 0affcd87-79f5-57a1-d976-000000000072 28285 1727204267.01419: WORKER PROCESS EXITING 28285 1727204267.01434: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:57:47 -0400 (0:00:00.057) 0:00:07.824 ***** 28285 1727204267.01476: entering _queue_task() for managed-node1/package 28285 1727204267.01911: worker is 1 (out of 1 available) 28285 1727204267.01923: exiting _queue_task() for managed-node1/package 28285 1727204267.01934: done queuing things up, now waiting for results queue to drain 28285 1727204267.01936: waiting for pending results... 
28285 1727204267.02209: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28285 1727204267.02341: in run() - task 0affcd87-79f5-57a1-d976-000000000073 28285 1727204267.02362: variable 'ansible_search_path' from source: unknown 28285 1727204267.02373: variable 'ansible_search_path' from source: unknown 28285 1727204267.02417: calling self._execute() 28285 1727204267.02509: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204267.02520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204267.02534: variable 'omit' from source: magic vars 28285 1727204267.02982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204267.05638: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204267.05718: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204267.05775: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204267.05814: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204267.05840: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204267.05923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204267.05960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204267.05997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204267.06043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204267.06069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204267.06223: variable 'ansible_distribution' from source: facts 28285 1727204267.06234: variable 'ansible_distribution_major_version' from source: facts 28285 1727204267.06259: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204267.06269: when evaluation is False, skipping this task 28285 1727204267.06277: _execute() done 28285 1727204267.06285: dumping result to json 28285 1727204267.06296: done dumping result, returning 28285 1727204267.06306: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-57a1-d976-000000000073] 28285 1727204267.06314: sending task result for task 0affcd87-79f5-57a1-d976-000000000073 skipping: [managed-node1] => { "changed": false, "false_condition": 
"(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204267.06468: no more pending results, returning what we have 28285 1727204267.06472: results queue empty 28285 1727204267.06473: checking for any_errors_fatal 28285 1727204267.06479: done checking for any_errors_fatal 28285 1727204267.06479: checking for max_fail_percentage 28285 1727204267.06481: done checking for max_fail_percentage 28285 1727204267.06482: checking to see if all hosts have failed and the running result is not ok 28285 1727204267.06483: done checking to see if all hosts have failed 28285 1727204267.06484: getting the remaining hosts for this loop 28285 1727204267.06486: done getting the remaining hosts for this loop 28285 1727204267.06490: getting the next task for host managed-node1 28285 1727204267.06497: done getting next task for host managed-node1 28285 1727204267.06501: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28285 1727204267.06505: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204267.06521: getting variables 28285 1727204267.06523: in VariableManager get_vars() 28285 1727204267.06586: Calling all_inventory to load vars for managed-node1 28285 1727204267.06589: Calling groups_inventory to load vars for managed-node1 28285 1727204267.06592: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204267.06602: Calling all_plugins_play to load vars for managed-node1 28285 1727204267.06605: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204267.06607: Calling groups_plugins_play to load vars for managed-node1 28285 1727204267.06842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204267.07073: done with get_vars() 28285 1727204267.07199: done getting variables 28285 1727204267.07233: done sending task result for task 0affcd87-79f5-57a1-d976-000000000073 28285 1727204267.07236: WORKER PROCESS EXITING 28285 1727204267.07318: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:57:47 -0400 (0:00:00.058) 0:00:07.883 ***** 28285 1727204267.07359: entering _queue_task() for managed-node1/package 28285 1727204267.07770: worker is 1 (out of 1 available) 28285 1727204267.07782: exiting _queue_task() for managed-node1/package 28285 1727204267.07793: done queuing things up, now waiting for results queue to drain 28285 
1727204267.07794: waiting for pending results... 28285 1727204267.08061: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28285 1727204267.08198: in run() - task 0affcd87-79f5-57a1-d976-000000000074 28285 1727204267.08214: variable 'ansible_search_path' from source: unknown 28285 1727204267.08222: variable 'ansible_search_path' from source: unknown 28285 1727204267.08266: calling self._execute() 28285 1727204267.08358: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204267.08371: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204267.08384: variable 'omit' from source: magic vars 28285 1727204267.08853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204267.11460: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204267.11557: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204267.11610: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204267.11661: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204267.11696: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204267.11793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204267.11831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204267.11869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204267.11919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204267.11940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204267.12102: variable 'ansible_distribution' from source: facts 28285 1727204267.12114: variable 'ansible_distribution_major_version' from source: facts 28285 1727204267.12138: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204267.12150: when evaluation is False, skipping this task 28285 1727204267.12157: _execute() done 28285 1727204267.12165: dumping result to json 28285 1727204267.12173: done dumping result, returning 28285 1727204267.12186: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-57a1-d976-000000000074] 28285 1727204267.12199: sending task result for task 0affcd87-79f5-57a1-d976-000000000074 skipping: [managed-node1] => { "changed": false, 
"false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204267.12337: no more pending results, returning what we have 28285 1727204267.12341: results queue empty 28285 1727204267.12341: checking for any_errors_fatal 28285 1727204267.12350: done checking for any_errors_fatal 28285 1727204267.12351: checking for max_fail_percentage 28285 1727204267.12352: done checking for max_fail_percentage 28285 1727204267.12353: checking to see if all hosts have failed and the running result is not ok 28285 1727204267.12354: done checking to see if all hosts have failed 28285 1727204267.12355: getting the remaining hosts for this loop 28285 1727204267.12357: done getting the remaining hosts for this loop 28285 1727204267.12361: getting the next task for host managed-node1 28285 1727204267.12369: done getting next task for host managed-node1 28285 1727204267.12373: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28285 1727204267.12376: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204267.12391: getting variables 28285 1727204267.12393: in VariableManager get_vars() 28285 1727204267.12446: Calling all_inventory to load vars for managed-node1 28285 1727204267.12451: Calling groups_inventory to load vars for managed-node1 28285 1727204267.12453: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204267.12465: Calling all_plugins_play to load vars for managed-node1 28285 1727204267.12467: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204267.12470: Calling groups_plugins_play to load vars for managed-node1 28285 1727204267.12657: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204267.12894: done with get_vars() 28285 1727204267.12906: done getting variables 28285 1727204267.12970: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:57:47 -0400 (0:00:00.057) 0:00:07.941 ***** 28285 1727204267.13125: entering _queue_task() for managed-node1/service 28285 1727204267.13169: done sending task result for task 0affcd87-79f5-57a1-d976-000000000074 28285 1727204267.13173: WORKER PROCESS EXITING 28285 1727204267.13599: worker is 1 (out of 1 available) 28285 1727204267.13612: exiting _queue_task() for managed-node1/service 28285 1727204267.13623: done queuing things up, now waiting for results queue to 
drain 28285 1727204267.13624: waiting for pending results... 28285 1727204267.13896: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28285 1727204267.14030: in run() - task 0affcd87-79f5-57a1-d976-000000000075 28285 1727204267.14047: variable 'ansible_search_path' from source: unknown 28285 1727204267.14057: variable 'ansible_search_path' from source: unknown 28285 1727204267.14109: calling self._execute() 28285 1727204267.14193: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204267.14209: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204267.14222: variable 'omit' from source: magic vars 28285 1727204267.14679: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204267.17287: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204267.17374: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204267.17418: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204267.17471: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204267.17503: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204267.17597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204267.17624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204267.17657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204267.17701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204267.17716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204267.17867: variable 'ansible_distribution' from source: facts 28285 1727204267.17882: variable 'ansible_distribution_major_version' from source: facts 28285 1727204267.17908: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204267.17915: when evaluation is False, skipping this task 28285 1727204267.17921: _execute() done 28285 1727204267.17927: dumping result to json 28285 1727204267.17934: done dumping result, returning 28285 1727204267.17944: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-57a1-d976-000000000075] 28285 1727204267.17957: sending task result for task 0affcd87-79f5-57a1-d976-000000000075 28285 1727204267.18076: done sending task result 
for task 0affcd87-79f5-57a1-d976-000000000075 skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204267.18122: no more pending results, returning what we have 28285 1727204267.18125: results queue empty 28285 1727204267.18126: checking for any_errors_fatal 28285 1727204267.18134: done checking for any_errors_fatal 28285 1727204267.18134: checking for max_fail_percentage 28285 1727204267.18136: done checking for max_fail_percentage 28285 1727204267.18137: checking to see if all hosts have failed and the running result is not ok 28285 1727204267.18138: done checking to see if all hosts have failed 28285 1727204267.18139: getting the remaining hosts for this loop 28285 1727204267.18141: done getting the remaining hosts for this loop 28285 1727204267.18145: getting the next task for host managed-node1 28285 1727204267.18155: done getting next task for host managed-node1 28285 1727204267.18160: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28285 1727204267.18163: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204267.18185: getting variables 28285 1727204267.18187: in VariableManager get_vars() 28285 1727204267.18242: Calling all_inventory to load vars for managed-node1 28285 1727204267.18245: Calling groups_inventory to load vars for managed-node1 28285 1727204267.18250: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204267.18261: Calling all_plugins_play to load vars for managed-node1 28285 1727204267.18265: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204267.18268: Calling groups_plugins_play to load vars for managed-node1 28285 1727204267.18522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204267.18754: done with get_vars() 28285 1727204267.18767: done getting variables 28285 1727204267.18915: WORKER PROCESS EXITING 28285 1727204267.18954: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:57:47 -0400 (0:00:00.058) 0:00:07.999 ***** 28285 1727204267.18992: entering _queue_task() for managed-node1/service 28285 1727204267.19434: worker is 1 (out of 1 available) 28285 1727204267.19453: exiting _queue_task() for managed-node1/service 28285 1727204267.19468: done queuing things up, now waiting for results queue to drain 28285 1727204267.19470: waiting for pending 
results... 28285 1727204267.19738: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28285 1727204267.19884: in run() - task 0affcd87-79f5-57a1-d976-000000000076 28285 1727204267.19906: variable 'ansible_search_path' from source: unknown 28285 1727204267.19915: variable 'ansible_search_path' from source: unknown 28285 1727204267.19956: calling self._execute() 28285 1727204267.20045: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204267.20059: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204267.20074: variable 'omit' from source: magic vars 28285 1727204267.20526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204267.23121: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204267.23219: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204267.23275: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204267.23319: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204267.23353: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204267.23457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204267.23498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204267.23532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204267.23582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204267.23608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204267.23774: variable 'ansible_distribution' from source: facts 28285 1727204267.23787: variable 'ansible_distribution_major_version' from source: facts 28285 1727204267.23813: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204267.23827: when evaluation is False, skipping this task 28285 1727204267.23837: _execute() done 28285 1727204267.23844: dumping result to json 28285 1727204267.23856: done dumping result, returning 28285 1727204267.23872: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-57a1-d976-000000000076] 28285 1727204267.23884: sending task result for task 0affcd87-79f5-57a1-d976-000000000076 skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": 
false } 28285 1727204267.24046: no more pending results, returning what we have 28285 1727204267.24052: results queue empty 28285 1727204267.24053: checking for any_errors_fatal 28285 1727204267.24060: done checking for any_errors_fatal 28285 1727204267.24061: checking for max_fail_percentage 28285 1727204267.24065: done checking for max_fail_percentage 28285 1727204267.24066: checking to see if all hosts have failed and the running result is not ok 28285 1727204267.24067: done checking to see if all hosts have failed 28285 1727204267.24068: getting the remaining hosts for this loop 28285 1727204267.24070: done getting the remaining hosts for this loop 28285 1727204267.24074: getting the next task for host managed-node1 28285 1727204267.24081: done getting next task for host managed-node1 28285 1727204267.24086: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28285 1727204267.24089: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204267.24113: getting variables 28285 1727204267.24116: in VariableManager get_vars() 28285 1727204267.24184: Calling all_inventory to load vars for managed-node1 28285 1727204267.24187: Calling groups_inventory to load vars for managed-node1 28285 1727204267.24204: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204267.24221: Calling all_plugins_play to load vars for managed-node1 28285 1727204267.24225: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204267.24233: Calling groups_plugins_play to load vars for managed-node1 28285 1727204267.24423: done sending task result for task 0affcd87-79f5-57a1-d976-000000000076 28285 1727204267.24426: WORKER PROCESS EXITING 28285 1727204267.24453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204267.24635: done with get_vars() 28285 1727204267.24644: done getting variables 28285 1727204267.24690: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:57:47 -0400 (0:00:00.057) 0:00:08.057 ***** 28285 1727204267.24715: entering _queue_task() for managed-node1/service 28285 1727204267.24924: worker is 1 (out of 1 available) 28285 1727204267.24934: exiting _queue_task() for managed-node1/service 28285 1727204267.24947: done queuing things up, now waiting for results queue to drain 28285 1727204267.24952: waiting for pending results... 
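The "Enable and start NetworkManager" task above (main.yml:122) is skipped and its result is shown as censored because the task sets no_log: true. A minimal sketch of what such a task could look like follows; the service name and option layout are assumptions, and the conditional is copied from the false_condition reported in the log (it may be inherited from an enclosing block), so this is an illustration rather than the actual role source.

- name: Enable and start NetworkManager
  service:
    name: NetworkManager            # assumed service unit name
    state: started
    enabled: true
  no_log: true                      # why the skip result above reads "censored"
  when: >-
    ansible_distribution in ['CentOS', 'RedHat']
    and ansible_distribution_major_version | int < 9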
28285 1727204267.25123: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28285 1727204267.25210: in run() - task 0affcd87-79f5-57a1-d976-000000000077 28285 1727204267.25217: variable 'ansible_search_path' from source: unknown 28285 1727204267.25220: variable 'ansible_search_path' from source: unknown 28285 1727204267.25253: calling self._execute() 28285 1727204267.25327: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204267.25330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204267.25339: variable 'omit' from source: magic vars 28285 1727204267.25669: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204267.28210: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204267.28276: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204267.28315: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204267.28365: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204267.28395: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204267.28571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204267.28575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204267.28578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204267.28581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204267.28660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204267.28745: variable 'ansible_distribution' from source: facts 28285 1727204267.28753: variable 'ansible_distribution_major_version' from source: facts 28285 1727204267.28781: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204267.28786: when evaluation is False, skipping this task 28285 1727204267.28789: _execute() done 28285 1727204267.28791: dumping result to json 28285 1727204267.28793: done dumping result, returning 28285 1727204267.28801: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-57a1-d976-000000000077] 28285 1727204267.28807: sending task result for task 0affcd87-79f5-57a1-d976-000000000077 28285 1727204267.28906: done sending task result for task 0affcd87-79f5-57a1-d976-000000000077 28285 1727204267.28909: WORKER PROCESS EXITING skipping: 
[managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204267.28952: no more pending results, returning what we have 28285 1727204267.28955: results queue empty 28285 1727204267.28956: checking for any_errors_fatal 28285 1727204267.28963: done checking for any_errors_fatal 28285 1727204267.28966: checking for max_fail_percentage 28285 1727204267.28968: done checking for max_fail_percentage 28285 1727204267.28969: checking to see if all hosts have failed and the running result is not ok 28285 1727204267.28970: done checking to see if all hosts have failed 28285 1727204267.28970: getting the remaining hosts for this loop 28285 1727204267.28972: done getting the remaining hosts for this loop 28285 1727204267.28976: getting the next task for host managed-node1 28285 1727204267.28982: done getting next task for host managed-node1 28285 1727204267.28986: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 28285 1727204267.28989: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204267.29004: getting variables 28285 1727204267.29006: in VariableManager get_vars() 28285 1727204267.29059: Calling all_inventory to load vars for managed-node1 28285 1727204267.29062: Calling groups_inventory to load vars for managed-node1 28285 1727204267.29066: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204267.29075: Calling all_plugins_play to load vars for managed-node1 28285 1727204267.29077: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204267.29080: Calling groups_plugins_play to load vars for managed-node1 28285 1727204267.29367: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204267.30088: done with get_vars() 28285 1727204267.30099: done getting variables 28285 1727204267.30169: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:57:47 -0400 (0:00:00.054) 0:00:08.112 ***** 28285 1727204267.30202: entering _queue_task() for managed-node1/service 28285 1727204267.30500: worker is 1 (out of 1 available) 28285 1727204267.30512: exiting _queue_task() for managed-node1/service 28285 1727204267.30526: done queuing things up, now waiting for results queue to drain 28285 1727204267.30528: waiting for pending results... 
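Every role task in this stretch reports the same false_condition. A quick way to see how that expression evaluates on a host is a throwaway debug task; the sketch below is not part of the run above and assumes facts have already been gathered.

- name: Check the distribution conditional reported as false_condition
  debug:
    msg: >-
      {{ ansible_distribution in ['CentOS', 'RedHat']
         and ansible_distribution_major_version | int < 9 }}

On the managed nodes in this run the expression renders False, which is why each task guarded by it is skipped.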
28285 1727204267.30909: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service 28285 1727204267.31042: in run() - task 0affcd87-79f5-57a1-d976-000000000078 28285 1727204267.31104: variable 'ansible_search_path' from source: unknown 28285 1727204267.31114: variable 'ansible_search_path' from source: unknown 28285 1727204267.31157: calling self._execute() 28285 1727204267.31253: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204267.31268: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204267.31282: variable 'omit' from source: magic vars 28285 1727204267.31759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204267.34678: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204267.34778: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204267.34836: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204267.34879: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204267.34913: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204267.35009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204267.35068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204267.35100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204267.35156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204267.35180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204267.35359: variable 'ansible_distribution' from source: facts 28285 1727204267.35378: variable 'ansible_distribution_major_version' from source: facts 28285 1727204267.35400: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204267.35407: when evaluation is False, skipping this task 28285 1727204267.35412: _execute() done 28285 1727204267.35417: dumping result to json 28285 1727204267.35644: done dumping result, returning 28285 1727204267.35658: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-57a1-d976-000000000078] 28285 1727204267.35674: sending task result for task 0affcd87-79f5-57a1-d976-000000000078 skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28285 
1727204267.35835: no more pending results, returning what we have 28285 1727204267.35839: results queue empty 28285 1727204267.35840: checking for any_errors_fatal 28285 1727204267.35850: done checking for any_errors_fatal 28285 1727204267.35851: checking for max_fail_percentage 28285 1727204267.35853: done checking for max_fail_percentage 28285 1727204267.35854: checking to see if all hosts have failed and the running result is not ok 28285 1727204267.35855: done checking to see if all hosts have failed 28285 1727204267.35856: getting the remaining hosts for this loop 28285 1727204267.35858: done getting the remaining hosts for this loop 28285 1727204267.35862: getting the next task for host managed-node1 28285 1727204267.35895: done getting next task for host managed-node1 28285 1727204267.35900: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28285 1727204267.35903: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204267.35921: getting variables 28285 1727204267.35923: in VariableManager get_vars() 28285 1727204267.35981: Calling all_inventory to load vars for managed-node1 28285 1727204267.35984: Calling groups_inventory to load vars for managed-node1 28285 1727204267.35986: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204267.35999: Calling all_plugins_play to load vars for managed-node1 28285 1727204267.36002: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204267.36005: Calling groups_plugins_play to load vars for managed-node1 28285 1727204267.36197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204267.36484: done with get_vars() 28285 1727204267.36497: done getting variables 28285 1727204267.36588: done sending task result for task 0affcd87-79f5-57a1-d976-000000000078 28285 1727204267.36591: WORKER PROCESS EXITING 28285 1727204267.36613: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:57:47 -0400 (0:00:00.064) 0:00:08.176 ***** 28285 1727204267.36655: entering _queue_task() for managed-node1/copy 28285 1727204267.37168: worker is 1 (out of 1 available) 28285 1727204267.37181: exiting _queue_task() for managed-node1/copy 28285 1727204267.37193: done queuing things up, now waiting for results queue to drain 28285 1727204267.37195: waiting for pending results... 
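The task just queued, "Ensure initscripts network file dependency is present" (main.yml:150), loads the copy action plugin before being skipped on the same conditional. A hypothetical shape for such a task is sketched below; the destination path, content, and mode are assumptions, not taken from the role source.

- name: Ensure initscripts network file dependency is present
  copy:
    dest: /etc/sysconfig/network                       # assumed destination
    content: "# Managed by the network system role\n"  # assumed content
    mode: "0644"
  when: >-
    ansible_distribution in ['CentOS', 'RedHat']
    and ansible_distribution_major_version | int < 9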
28285 1727204267.37477: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28285 1727204267.37613: in run() - task 0affcd87-79f5-57a1-d976-000000000079 28285 1727204267.37623: variable 'ansible_search_path' from source: unknown 28285 1727204267.37627: variable 'ansible_search_path' from source: unknown 28285 1727204267.37660: calling self._execute() 28285 1727204267.37720: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204267.37724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204267.37733: variable 'omit' from source: magic vars 28285 1727204267.38046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204267.40184: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204267.40230: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204267.40259: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204267.40285: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204267.40306: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204267.40362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204267.40384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204267.40401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204267.40431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204267.40442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204267.40541: variable 'ansible_distribution' from source: facts 28285 1727204267.40545: variable 'ansible_distribution_major_version' from source: facts 28285 1727204267.40562: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204267.40566: when evaluation is False, skipping this task 28285 1727204267.40569: _execute() done 28285 1727204267.40571: dumping result to json 28285 1727204267.40573: done dumping result, returning 28285 1727204267.40580: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-57a1-d976-000000000079] 28285 1727204267.40587: sending task result for task 0affcd87-79f5-57a1-d976-000000000079 28285 1727204267.40675: done sending task result for task 0affcd87-79f5-57a1-d976-000000000079 28285 1727204267.40678: 
WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204267.40732: no more pending results, returning what we have 28285 1727204267.40736: results queue empty 28285 1727204267.40736: checking for any_errors_fatal 28285 1727204267.40742: done checking for any_errors_fatal 28285 1727204267.40743: checking for max_fail_percentage 28285 1727204267.40745: done checking for max_fail_percentage 28285 1727204267.40746: checking to see if all hosts have failed and the running result is not ok 28285 1727204267.40747: done checking to see if all hosts have failed 28285 1727204267.40747: getting the remaining hosts for this loop 28285 1727204267.40751: done getting the remaining hosts for this loop 28285 1727204267.40755: getting the next task for host managed-node1 28285 1727204267.40760: done getting next task for host managed-node1 28285 1727204267.40766: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28285 1727204267.40769: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204267.40784: getting variables 28285 1727204267.40788: in VariableManager get_vars() 28285 1727204267.40834: Calling all_inventory to load vars for managed-node1 28285 1727204267.40837: Calling groups_inventory to load vars for managed-node1 28285 1727204267.40839: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204267.40850: Calling all_plugins_play to load vars for managed-node1 28285 1727204267.40852: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204267.40855: Calling groups_plugins_play to load vars for managed-node1 28285 1727204267.41009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204267.41132: done with get_vars() 28285 1727204267.41139: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:57:47 -0400 (0:00:00.045) 0:00:08.222 ***** 28285 1727204267.41201: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 28285 1727204267.41392: worker is 1 (out of 1 available) 28285 1727204267.41403: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 28285 1727204267.41414: done queuing things up, now waiting for results queue to drain 28285 1727204267.41416: waiting for pending results... 
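The next task, "Configure networking connection profiles" (main.yml:159), is the one that would hand work to the collection's network_connections action plugin. A sketch of a typical invocation that feeds it input is below; the play and profile values are assumptions used only to show the shape of the network_connections variable, not settings from this run.

- hosts: managed-node1
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: eth0            # assumed profile name
            type: ethernet
            state: up
            ip:
              dhcp4: true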
28285 1727204267.41577: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28285 1727204267.41657: in run() - task 0affcd87-79f5-57a1-d976-00000000007a 28285 1727204267.41670: variable 'ansible_search_path' from source: unknown 28285 1727204267.41674: variable 'ansible_search_path' from source: unknown 28285 1727204267.41701: calling self._execute() 28285 1727204267.41763: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204267.41770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204267.41778: variable 'omit' from source: magic vars 28285 1727204267.42074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204267.43678: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204267.43733: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204267.43761: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204267.43788: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204267.43811: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204267.43869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204267.43888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204267.43906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204267.43939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204267.43951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204267.44054: variable 'ansible_distribution' from source: facts 28285 1727204267.44060: variable 'ansible_distribution_major_version' from source: facts 28285 1727204267.44076: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204267.44079: when evaluation is False, skipping this task 28285 1727204267.44082: _execute() done 28285 1727204267.44084: dumping result to json 28285 1727204267.44087: done dumping result, returning 28285 1727204267.44093: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-57a1-d976-00000000007a] 28285 1727204267.44099: sending task result for task 0affcd87-79f5-57a1-d976-00000000007a 28285 1727204267.44194: done sending task result for task 0affcd87-79f5-57a1-d976-00000000007a 28285 1727204267.44197: WORKER PROCESS EXITING 
skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204267.44245: no more pending results, returning what we have 28285 1727204267.44251: results queue empty 28285 1727204267.44252: checking for any_errors_fatal 28285 1727204267.44257: done checking for any_errors_fatal 28285 1727204267.44258: checking for max_fail_percentage 28285 1727204267.44259: done checking for max_fail_percentage 28285 1727204267.44260: checking to see if all hosts have failed and the running result is not ok 28285 1727204267.44261: done checking to see if all hosts have failed 28285 1727204267.44262: getting the remaining hosts for this loop 28285 1727204267.44270: done getting the remaining hosts for this loop 28285 1727204267.44274: getting the next task for host managed-node1 28285 1727204267.44280: done getting next task for host managed-node1 28285 1727204267.44284: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 28285 1727204267.44287: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204267.44301: getting variables 28285 1727204267.44303: in VariableManager get_vars() 28285 1727204267.44353: Calling all_inventory to load vars for managed-node1 28285 1727204267.44356: Calling groups_inventory to load vars for managed-node1 28285 1727204267.44359: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204267.44369: Calling all_plugins_play to load vars for managed-node1 28285 1727204267.44371: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204267.44373: Calling groups_plugins_play to load vars for managed-node1 28285 1727204267.44493: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204267.44614: done with get_vars() 28285 1727204267.44622: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:57:47 -0400 (0:00:00.034) 0:00:08.256 ***** 28285 1727204267.44683: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_state 28285 1727204267.44874: worker is 1 (out of 1 available) 28285 1727204267.44886: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_state 28285 1727204267.44897: done queuing things up, now waiting for results queue to drain 28285 1727204267.44899: waiting for pending results... 
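The "Configure networking state" task (main.yml:171) goes through the network_state action plugin instead. Below is an assumed example of the kind of declarative, Nmstate-style input it would consume; the interface name and settings are illustrative and not taken from this run.

network_state:
  interfaces:
    - name: eth0                  # assumed interface name
      type: ethernet
      state: up
      ipv4:
        enabled: true
        dhcp: true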
28285 1727204267.45060: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state 28285 1727204267.45142: in run() - task 0affcd87-79f5-57a1-d976-00000000007b 28285 1727204267.45154: variable 'ansible_search_path' from source: unknown 28285 1727204267.45158: variable 'ansible_search_path' from source: unknown 28285 1727204267.45186: calling self._execute() 28285 1727204267.45243: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204267.45247: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204267.45256: variable 'omit' from source: magic vars 28285 1727204267.45555: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204267.47133: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204267.47180: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204267.47206: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204267.47232: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204267.47254: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204267.47310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204267.47329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204267.47347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204267.47377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204267.47392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204267.47487: variable 'ansible_distribution' from source: facts 28285 1727204267.47491: variable 'ansible_distribution_major_version' from source: facts 28285 1727204267.47507: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204267.47512: when evaluation is False, skipping this task 28285 1727204267.47514: _execute() done 28285 1727204267.47517: dumping result to json 28285 1727204267.47519: done dumping result, returning 28285 1727204267.47524: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-57a1-d976-00000000007b] 28285 1727204267.47533: sending task result for task 0affcd87-79f5-57a1-d976-00000000007b 28285 1727204267.47615: done sending task result for task 0affcd87-79f5-57a1-d976-00000000007b 28285 1727204267.47618: WORKER PROCESS EXITING skipping: [managed-node1] => { 
"changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204267.47677: no more pending results, returning what we have 28285 1727204267.47680: results queue empty 28285 1727204267.47681: checking for any_errors_fatal 28285 1727204267.47687: done checking for any_errors_fatal 28285 1727204267.47688: checking for max_fail_percentage 28285 1727204267.47690: done checking for max_fail_percentage 28285 1727204267.47691: checking to see if all hosts have failed and the running result is not ok 28285 1727204267.47691: done checking to see if all hosts have failed 28285 1727204267.47692: getting the remaining hosts for this loop 28285 1727204267.47694: done getting the remaining hosts for this loop 28285 1727204267.47697: getting the next task for host managed-node1 28285 1727204267.47703: done getting next task for host managed-node1 28285 1727204267.47707: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28285 1727204267.47709: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204267.47724: getting variables 28285 1727204267.47725: in VariableManager get_vars() 28285 1727204267.47778: Calling all_inventory to load vars for managed-node1 28285 1727204267.47781: Calling groups_inventory to load vars for managed-node1 28285 1727204267.47782: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204267.47788: Calling all_plugins_play to load vars for managed-node1 28285 1727204267.47789: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204267.47791: Calling groups_plugins_play to load vars for managed-node1 28285 1727204267.47936: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204267.48061: done with get_vars() 28285 1727204267.48070: done getting variables 28285 1727204267.48109: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:57:47 -0400 (0:00:00.034) 0:00:08.291 ***** 28285 1727204267.48130: entering _queue_task() for managed-node1/debug 28285 1727204267.48311: worker is 1 (out of 1 available) 28285 1727204267.48324: exiting _queue_task() for managed-node1/debug 28285 1727204267.48336: done queuing things up, now waiting for results queue to drain 28285 1727204267.48337: waiting for pending results... 
28285 1727204267.48495: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28285 1727204267.48576: in run() - task 0affcd87-79f5-57a1-d976-00000000007c 28285 1727204267.48584: variable 'ansible_search_path' from source: unknown 28285 1727204267.48587: variable 'ansible_search_path' from source: unknown 28285 1727204267.48615: calling self._execute() 28285 1727204267.48678: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204267.48682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204267.48689: variable 'omit' from source: magic vars 28285 1727204267.48980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204267.50512: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204267.50733: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204267.50761: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204267.50789: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204267.50809: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204267.50866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204267.50887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204267.50906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204267.50932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204267.50943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204267.51039: variable 'ansible_distribution' from source: facts 28285 1727204267.51043: variable 'ansible_distribution_major_version' from source: facts 28285 1727204267.51060: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204267.51063: when evaluation is False, skipping this task 28285 1727204267.51067: _execute() done 28285 1727204267.51069: dumping result to json 28285 1727204267.51072: done dumping result, returning 28285 1727204267.51078: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-57a1-d976-00000000007c] 28285 1727204267.51083: sending task result for task 0affcd87-79f5-57a1-d976-00000000007c 28285 1727204267.51162: done sending task result for task 0affcd87-79f5-57a1-d976-00000000007c 28285 1727204267.51166: WORKER 
PROCESS EXITING skipping: [managed-node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 28285 1727204267.51214: no more pending results, returning what we have 28285 1727204267.51217: results queue empty 28285 1727204267.51218: checking for any_errors_fatal 28285 1727204267.51223: done checking for any_errors_fatal 28285 1727204267.51223: checking for max_fail_percentage 28285 1727204267.51225: done checking for max_fail_percentage 28285 1727204267.51227: checking to see if all hosts have failed and the running result is not ok 28285 1727204267.51228: done checking to see if all hosts have failed 28285 1727204267.51228: getting the remaining hosts for this loop 28285 1727204267.51230: done getting the remaining hosts for this loop 28285 1727204267.51233: getting the next task for host managed-node1 28285 1727204267.51240: done getting next task for host managed-node1 28285 1727204267.51243: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28285 1727204267.51246: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204267.51262: getting variables 28285 1727204267.51266: in VariableManager get_vars() 28285 1727204267.51313: Calling all_inventory to load vars for managed-node1 28285 1727204267.51316: Calling groups_inventory to load vars for managed-node1 28285 1727204267.51318: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204267.51326: Calling all_plugins_play to load vars for managed-node1 28285 1727204267.51328: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204267.51330: Calling groups_plugins_play to load vars for managed-node1 28285 1727204267.51447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204267.51575: done with get_vars() 28285 1727204267.51582: done getting variables 28285 1727204267.51623: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:57:47 -0400 (0:00:00.035) 0:00:08.326 ***** 28285 1727204267.51646: entering _queue_task() for managed-node1/debug 28285 1727204267.51828: worker is 1 (out of 1 available) 28285 1727204267.51842: exiting _queue_task() for managed-node1/debug 28285 1727204267.51857: done queuing things up, now waiting for results queue to drain 28285 1727204267.51859: waiting for pending results... 
28285 1727204267.52020: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28285 1727204267.52098: in run() - task 0affcd87-79f5-57a1-d976-00000000007d 28285 1727204267.52110: variable 'ansible_search_path' from source: unknown 28285 1727204267.52114: variable 'ansible_search_path' from source: unknown 28285 1727204267.52140: calling self._execute() 28285 1727204267.52207: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204267.52210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204267.52219: variable 'omit' from source: magic vars 28285 1727204267.52543: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204267.54911: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204267.54953: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204267.54991: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204267.55017: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204267.55038: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204267.55095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204267.55115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204267.55134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204267.55169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204267.55183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204267.55286: variable 'ansible_distribution' from source: facts 28285 1727204267.55289: variable 'ansible_distribution_major_version' from source: facts 28285 1727204267.55306: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204267.55309: when evaluation is False, skipping this task 28285 1727204267.55311: _execute() done 28285 1727204267.55314: dumping result to json 28285 1727204267.55316: done dumping result, returning 28285 1727204267.55323: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-57a1-d976-00000000007d] 28285 1727204267.55331: sending task result for task 0affcd87-79f5-57a1-d976-00000000007d 28285 1727204267.55410: done sending task result for task 0affcd87-79f5-57a1-d976-00000000007d 28285 1727204267.55413: WORKER 
PROCESS EXITING skipping: [managed-node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 28285 1727204267.55456: no more pending results, returning what we have 28285 1727204267.55459: results queue empty 28285 1727204267.55460: checking for any_errors_fatal 28285 1727204267.55466: done checking for any_errors_fatal 28285 1727204267.55467: checking for max_fail_percentage 28285 1727204267.55469: done checking for max_fail_percentage 28285 1727204267.55470: checking to see if all hosts have failed and the running result is not ok 28285 1727204267.55471: done checking to see if all hosts have failed 28285 1727204267.55471: getting the remaining hosts for this loop 28285 1727204267.55473: done getting the remaining hosts for this loop 28285 1727204267.55476: getting the next task for host managed-node1 28285 1727204267.55482: done getting next task for host managed-node1 28285 1727204267.55486: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28285 1727204267.55489: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204267.55504: getting variables 28285 1727204267.55505: in VariableManager get_vars() 28285 1727204267.55555: Calling all_inventory to load vars for managed-node1 28285 1727204267.55559: Calling groups_inventory to load vars for managed-node1 28285 1727204267.55561: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204267.55572: Calling all_plugins_play to load vars for managed-node1 28285 1727204267.55574: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204267.55577: Calling groups_plugins_play to load vars for managed-node1 28285 1727204267.55746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204267.55870: done with get_vars() 28285 1727204267.55877: done getting variables 28285 1727204267.55917: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:57:47 -0400 (0:00:00.042) 0:00:08.369 ***** 28285 1727204267.55939: entering _queue_task() for managed-node1/debug 28285 1727204267.56138: worker is 1 (out of 1 available) 28285 1727204267.56153: exiting _queue_task() for managed-node1/debug 28285 1727204267.56169: done queuing things up, now waiting for results queue to drain 28285 1727204267.56171: waiting for pending results... 
28285 1727204267.56389: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28285 1727204267.56517: in run() - task 0affcd87-79f5-57a1-d976-00000000007e 28285 1727204267.56535: variable 'ansible_search_path' from source: unknown 28285 1727204267.56542: variable 'ansible_search_path' from source: unknown 28285 1727204267.56582: calling self._execute() 28285 1727204267.56674: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204267.56686: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204267.56699: variable 'omit' from source: magic vars 28285 1727204267.57111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204267.59767: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204267.59836: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204267.59882: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204267.59933: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204267.59964: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204267.60044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204267.60082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204267.60114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204267.60160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204267.60182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204267.60329: variable 'ansible_distribution' from source: facts 28285 1727204267.60339: variable 'ansible_distribution_major_version' from source: facts 28285 1727204267.60361: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204267.60371: when evaluation is False, skipping this task 28285 1727204267.60378: _execute() done 28285 1727204267.60384: dumping result to json 28285 1727204267.60392: done dumping result, returning 28285 1727204267.60407: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-57a1-d976-00000000007e] 28285 1727204267.60419: sending task result for task 0affcd87-79f5-57a1-d976-00000000007e skipping: [managed-node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 
9)" } 28285 1727204267.60566: no more pending results, returning what we have 28285 1727204267.60570: results queue empty 28285 1727204267.60571: checking for any_errors_fatal 28285 1727204267.60576: done checking for any_errors_fatal 28285 1727204267.60577: checking for max_fail_percentage 28285 1727204267.60579: done checking for max_fail_percentage 28285 1727204267.60580: checking to see if all hosts have failed and the running result is not ok 28285 1727204267.60581: done checking to see if all hosts have failed 28285 1727204267.60582: getting the remaining hosts for this loop 28285 1727204267.60584: done getting the remaining hosts for this loop 28285 1727204267.60588: getting the next task for host managed-node1 28285 1727204267.60595: done getting next task for host managed-node1 28285 1727204267.60600: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 28285 1727204267.60603: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204267.60620: getting variables 28285 1727204267.60622: in VariableManager get_vars() 28285 1727204267.60679: Calling all_inventory to load vars for managed-node1 28285 1727204267.60682: Calling groups_inventory to load vars for managed-node1 28285 1727204267.60684: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204267.60695: Calling all_plugins_play to load vars for managed-node1 28285 1727204267.60697: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204267.60700: Calling groups_plugins_play to load vars for managed-node1 28285 1727204267.60876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204267.61091: done with get_vars() 28285 1727204267.61102: done getting variables 28285 1727204267.61444: done sending task result for task 0affcd87-79f5-57a1-d976-00000000007e 28285 1727204267.61448: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:57:47 -0400 (0:00:00.055) 0:00:08.424 ***** 28285 1727204267.61485: entering _queue_task() for managed-node1/ping 28285 1727204267.61735: worker is 1 (out of 1 available) 28285 1727204267.61748: exiting _queue_task() for managed-node1/ping 28285 1727204267.61760: done queuing things up, now waiting for results queue to drain 28285 1727204267.61761: waiting for pending results... 
28285 1727204267.62016: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 28285 1727204267.62144: in run() - task 0affcd87-79f5-57a1-d976-00000000007f 28285 1727204267.62167: variable 'ansible_search_path' from source: unknown 28285 1727204267.62174: variable 'ansible_search_path' from source: unknown 28285 1727204267.62217: calling self._execute() 28285 1727204267.62298: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204267.62310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204267.62325: variable 'omit' from source: magic vars 28285 1727204267.62745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204267.65446: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204267.65523: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204267.65568: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204267.65609: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204267.65640: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204267.65723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204267.65754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204267.65792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204267.65836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204267.65853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204267.66003: variable 'ansible_distribution' from source: facts 28285 1727204267.66014: variable 'ansible_distribution_major_version' from source: facts 28285 1727204267.66037: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204267.66044: when evaluation is False, skipping this task 28285 1727204267.66050: _execute() done 28285 1727204267.66056: dumping result to json 28285 1727204267.66063: done dumping result, returning 28285 1727204267.66077: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-57a1-d976-00000000007f] 28285 1727204267.66086: sending task result for task 0affcd87-79f5-57a1-d976-00000000007f 28285 1727204267.66186: done sending task result for task 0affcd87-79f5-57a1-d976-00000000007f 28285 1727204267.66193: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": 
false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204267.66251: no more pending results, returning what we have 28285 1727204267.66255: results queue empty 28285 1727204267.66255: checking for any_errors_fatal 28285 1727204267.66261: done checking for any_errors_fatal 28285 1727204267.66262: checking for max_fail_percentage 28285 1727204267.66265: done checking for max_fail_percentage 28285 1727204267.66267: checking to see if all hosts have failed and the running result is not ok 28285 1727204267.66267: done checking to see if all hosts have failed 28285 1727204267.66268: getting the remaining hosts for this loop 28285 1727204267.66270: done getting the remaining hosts for this loop 28285 1727204267.66274: getting the next task for host managed-node1 28285 1727204267.66284: done getting next task for host managed-node1 28285 1727204267.66286: ^ task is: TASK: meta (role_complete) 28285 1727204267.66289: ^ state is: HOST STATE: block=3, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204267.66306: getting variables 28285 1727204267.66308: in VariableManager get_vars() 28285 1727204267.66365: Calling all_inventory to load vars for managed-node1 28285 1727204267.66368: Calling groups_inventory to load vars for managed-node1 28285 1727204267.66371: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204267.66382: Calling all_plugins_play to load vars for managed-node1 28285 1727204267.66384: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204267.66387: Calling groups_plugins_play to load vars for managed-node1 28285 1727204267.66572: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204267.66837: done with get_vars() 28285 1727204267.66847: done getting variables 28285 1727204267.67189: done queuing things up, now waiting for results queue to drain 28285 1727204267.67191: results queue empty 28285 1727204267.67192: checking for any_errors_fatal 28285 1727204267.67194: done checking for any_errors_fatal 28285 1727204267.67195: checking for max_fail_percentage 28285 1727204267.67196: done checking for max_fail_percentage 28285 1727204267.67197: checking to see if all hosts have failed and the running result is not ok 28285 1727204267.67198: done checking to see if all hosts have failed 28285 1727204267.67198: getting the remaining hosts for this loop 28285 1727204267.67199: done getting the remaining hosts for this loop 28285 1727204267.67202: getting the next task for host managed-node1 28285 1727204267.67206: done getting next task for host managed-node1 28285 1727204267.67208: ^ task is: TASK: Get current device features 28285 1727204267.67210: ^ state is: HOST STATE: block=3, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204267.67212: getting variables 28285 1727204267.67213: in VariableManager get_vars() 28285 1727204267.67234: Calling all_inventory to load vars for managed-node1 28285 1727204267.67236: Calling groups_inventory to load vars for managed-node1 28285 1727204267.67238: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204267.67243: Calling all_plugins_play to load vars for managed-node1 28285 1727204267.67245: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204267.67248: Calling groups_plugins_play to load vars for managed-node1 28285 1727204267.67385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204267.67584: done with get_vars() 28285 1727204267.67593: done getting variables 28285 1727204267.67632: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get current device features] ********************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethtool_features.yml:82 Tuesday 24 September 2024 14:57:47 -0400 (0:00:00.061) 0:00:08.486 ***** 28285 1727204267.67659: entering _queue_task() for managed-node1/command 28285 1727204267.67909: worker is 1 (out of 1 available) 28285 1727204267.67921: exiting _queue_task() for managed-node1/command 28285 1727204267.67933: done queuing things up, now waiting for results queue to drain 28285 1727204267.67935: waiting for pending results... 
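The task queued here, "Get current device features" (tests_ethtool_features.yml:82), uses the plain command action that the log shows being loaded. Its body is not visible in the log; a plausible sketch, assuming it reads offload features with ethtool -k on a test interface (the interface variable and the register name are hypothetical), is:

    - name: Get current device features
      ansible.builtin.command: "ethtool -k {{ interface }}"   # assumed command; not shown in the log
      register: ethtool_features                               # hypothetical name, for use by the following tasks
      changed_when: false                                      # reading features should not report a change

In this run it is skipped anyway, because the same inherited distribution condition applies to it.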
28285 1727204267.68189: running TaskExecutor() for managed-node1/TASK: Get current device features 28285 1727204267.68286: in run() - task 0affcd87-79f5-57a1-d976-0000000000af 28285 1727204267.68303: variable 'ansible_search_path' from source: unknown 28285 1727204267.68340: calling self._execute() 28285 1727204267.68429: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204267.68439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204267.68454: variable 'omit' from source: magic vars 28285 1727204267.68882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204267.73714: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204267.73860: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204267.73941: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204267.74044: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204267.74144: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204267.74341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204267.74379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204267.74410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204267.74488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204267.74570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204267.74736: variable 'ansible_distribution' from source: facts 28285 1727204267.74749: variable 'ansible_distribution_major_version' from source: facts 28285 1727204267.74780: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204267.74789: when evaluation is False, skipping this task 28285 1727204267.74796: _execute() done 28285 1727204267.74802: dumping result to json 28285 1727204267.74810: done dumping result, returning 28285 1727204267.74820: done running TaskExecutor() for managed-node1/TASK: Get current device features [0affcd87-79f5-57a1-d976-0000000000af] 28285 1727204267.74831: sending task result for task 0affcd87-79f5-57a1-d976-0000000000af skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204267.74975: no more pending results, returning what we have 28285 1727204267.74979: results 
queue empty 28285 1727204267.74979: checking for any_errors_fatal 28285 1727204267.74981: done checking for any_errors_fatal 28285 1727204267.74982: checking for max_fail_percentage 28285 1727204267.74984: done checking for max_fail_percentage 28285 1727204267.74985: checking to see if all hosts have failed and the running result is not ok 28285 1727204267.74985: done checking to see if all hosts have failed 28285 1727204267.74986: getting the remaining hosts for this loop 28285 1727204267.74988: done getting the remaining hosts for this loop 28285 1727204267.74992: getting the next task for host managed-node1 28285 1727204267.74998: done getting next task for host managed-node1 28285 1727204267.75001: ^ task is: TASK: Show ethtool_features 28285 1727204267.75003: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204267.75006: getting variables 28285 1727204267.75008: in VariableManager get_vars() 28285 1727204267.75062: Calling all_inventory to load vars for managed-node1 28285 1727204267.75069: Calling groups_inventory to load vars for managed-node1 28285 1727204267.75071: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204267.75083: Calling all_plugins_play to load vars for managed-node1 28285 1727204267.75086: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204267.75089: Calling groups_plugins_play to load vars for managed-node1 28285 1727204267.75268: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204267.75980: done with get_vars() 28285 1727204267.75990: done getting variables 28285 1727204267.76134: done sending task result for task 0affcd87-79f5-57a1-d976-0000000000af 28285 1727204267.76137: WORKER PROCESS EXITING 28285 1727204267.76176: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show ethtool_features] *************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethtool_features.yml:86 Tuesday 24 September 2024 14:57:47 -0400 (0:00:00.085) 0:00:08.572 ***** 28285 1727204267.76203: entering _queue_task() for managed-node1/debug 28285 1727204267.76449: worker is 1 (out of 1 available) 28285 1727204267.76461: exiting _queue_task() for managed-node1/debug 28285 1727204267.76473: done queuing things up, now waiting for results queue to drain 28285 1727204267.76474: waiting for pending results... 
28285 1727204267.76724: running TaskExecutor() for managed-node1/TASK: Show ethtool_features 28285 1727204267.76825: in run() - task 0affcd87-79f5-57a1-d976-0000000000b0 28285 1727204267.76843: variable 'ansible_search_path' from source: unknown 28285 1727204267.76885: calling self._execute() 28285 1727204267.76978: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204267.76989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204267.77003: variable 'omit' from source: magic vars 28285 1727204267.77432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204267.79792: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204267.79880: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204267.79924: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204267.79970: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204267.80003: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204267.80091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204267.80127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204267.80160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204267.80212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204267.80233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204267.80391: variable 'ansible_distribution' from source: facts 28285 1727204267.80404: variable 'ansible_distribution_major_version' from source: facts 28285 1727204267.80429: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204267.80438: when evaluation is False, skipping this task 28285 1727204267.80445: _execute() done 28285 1727204267.80451: dumping result to json 28285 1727204267.80459: done dumping result, returning 28285 1727204267.80472: done running TaskExecutor() for managed-node1/TASK: Show ethtool_features [0affcd87-79f5-57a1-d976-0000000000b0] 28285 1727204267.80484: sending task result for task 0affcd87-79f5-57a1-d976-0000000000b0 28285 1727204267.80587: done sending task result for task 0affcd87-79f5-57a1-d976-0000000000b0 28285 1727204267.80590: WORKER PROCESS EXITING skipping: [managed-node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 28285 1727204267.80644: no more pending 
results, returning what we have 28285 1727204267.80647: results queue empty 28285 1727204267.80648: checking for any_errors_fatal 28285 1727204267.80654: done checking for any_errors_fatal 28285 1727204267.80654: checking for max_fail_percentage 28285 1727204267.80656: done checking for max_fail_percentage 28285 1727204267.80657: checking to see if all hosts have failed and the running result is not ok 28285 1727204267.80658: done checking to see if all hosts have failed 28285 1727204267.80658: getting the remaining hosts for this loop 28285 1727204267.80660: done getting the remaining hosts for this loop 28285 1727204267.80665: getting the next task for host managed-node1 28285 1727204267.80670: done getting next task for host managed-node1 28285 1727204267.80673: ^ task is: TASK: Assert device features 28285 1727204267.80675: ^ state is: HOST STATE: block=3, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204267.80678: getting variables 28285 1727204267.80679: in VariableManager get_vars() 28285 1727204267.80733: Calling all_inventory to load vars for managed-node1 28285 1727204267.80736: Calling groups_inventory to load vars for managed-node1 28285 1727204267.80738: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204267.80748: Calling all_plugins_play to load vars for managed-node1 28285 1727204267.80750: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204267.80753: Calling groups_plugins_play to load vars for managed-node1 28285 1727204267.80926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204267.81136: done with get_vars() 28285 1727204267.81148: done getting variables 28285 1727204267.81208: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert device features] ************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethtool_features.yml:89 Tuesday 24 September 2024 14:57:47 -0400 (0:00:00.050) 0:00:08.622 ***** 28285 1727204267.81237: entering _queue_task() for managed-node1/assert 28285 1727204267.81489: worker is 1 (out of 1 available) 28285 1727204267.81501: exiting _queue_task() for managed-node1/assert 28285 1727204267.81513: done queuing things up, now waiting for results queue to drain 28285 1727204267.81515: waiting for pending results... 
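The "Assert device features" task queued here (tests_ethtool_features.yml:89) loads the assert action. The actual assertion is not recorded in the log; a hedged sketch of what such a check might look like, reusing the hypothetical register from the previous sketch, is:

    - name: Assert device features
      ansible.builtin.assert:
        that:
          - "'tx-tcp-segmentation' in ethtool_features.stdout"           # hypothetical condition; the real check is not visible here
        fail_msg: Expected ethtool feature is not reported by the device  # hypothetical message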
28285 1727204267.81789: running TaskExecutor() for managed-node1/TASK: Assert device features 28285 1727204267.81881: in run() - task 0affcd87-79f5-57a1-d976-0000000000b1 28285 1727204267.81899: variable 'ansible_search_path' from source: unknown 28285 1727204267.81941: calling self._execute() 28285 1727204267.82040: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204267.82044: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204267.82056: variable 'omit' from source: magic vars 28285 1727204267.82371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204267.84224: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204267.84302: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204267.84348: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204267.84395: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204267.84423: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204267.84514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204267.84552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204267.84596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204267.84642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204267.84664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204267.84840: variable 'ansible_distribution' from source: facts 28285 1727204267.84852: variable 'ansible_distribution_major_version' from source: facts 28285 1727204267.84877: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204267.84891: when evaluation is False, skipping this task 28285 1727204267.84905: _execute() done 28285 1727204267.84915: dumping result to json 28285 1727204267.84924: done dumping result, returning 28285 1727204267.84936: done running TaskExecutor() for managed-node1/TASK: Assert device features [0affcd87-79f5-57a1-d976-0000000000b1] 28285 1727204267.84947: sending task result for task 0affcd87-79f5-57a1-d976-0000000000b1 28285 1727204267.85057: done sending task result for task 0affcd87-79f5-57a1-d976-0000000000b1 28285 1727204267.85060: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": 
"Conditional result was False" } 28285 1727204267.85112: no more pending results, returning what we have 28285 1727204267.85115: results queue empty 28285 1727204267.85116: checking for any_errors_fatal 28285 1727204267.85121: done checking for any_errors_fatal 28285 1727204267.85122: checking for max_fail_percentage 28285 1727204267.85123: done checking for max_fail_percentage 28285 1727204267.85124: checking to see if all hosts have failed and the running result is not ok 28285 1727204267.85125: done checking to see if all hosts have failed 28285 1727204267.85126: getting the remaining hosts for this loop 28285 1727204267.85127: done getting the remaining hosts for this loop 28285 1727204267.85131: getting the next task for host managed-node1 28285 1727204267.85136: done getting next task for host managed-node1 28285 1727204267.85139: ^ task is: TASK: TEST: I can enable tx_tcp_segmentation (using underscores). 28285 1727204267.85142: ^ state is: HOST STATE: block=3, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204267.85144: getting variables 28285 1727204267.85146: in VariableManager get_vars() 28285 1727204267.85202: Calling all_inventory to load vars for managed-node1 28285 1727204267.85205: Calling groups_inventory to load vars for managed-node1 28285 1727204267.85207: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204267.85217: Calling all_plugins_play to load vars for managed-node1 28285 1727204267.85219: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204267.85221: Calling groups_plugins_play to load vars for managed-node1 28285 1727204267.85402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204267.85520: done with get_vars() 28285 1727204267.85530: done getting variables 28285 1727204267.85574: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [TEST: I can enable tx_tcp_segmentation (using underscores).] ************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethtool_features.yml:102 Tuesday 24 September 2024 14:57:47 -0400 (0:00:00.043) 0:00:08.665 ***** 28285 1727204267.85594: entering _queue_task() for managed-node1/debug 28285 1727204267.85789: worker is 1 (out of 1 available) 28285 1727204267.85802: exiting _queue_task() for managed-node1/debug 28285 1727204267.85815: done queuing things up, now waiting for results queue to drain 28285 1727204267.85817: waiting for pending results... 28285 1727204267.85990: running TaskExecutor() for managed-node1/TASK: TEST: I can enable tx_tcp_segmentation (using underscores). 
28285 1727204267.86049: in run() - task 0affcd87-79f5-57a1-d976-0000000000b2 28285 1727204267.86062: variable 'ansible_search_path' from source: unknown 28285 1727204267.86096: calling self._execute() 28285 1727204267.86168: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204267.86174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204267.86182: variable 'omit' from source: magic vars 28285 1727204267.86489: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204267.88834: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204267.88927: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204267.88976: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204267.89023: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204267.89063: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204267.89141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204267.89166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204267.89199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204267.89223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204267.89236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204267.89342: variable 'ansible_distribution' from source: facts 28285 1727204267.89347: variable 'ansible_distribution_major_version' from source: facts 28285 1727204267.89367: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204267.89371: when evaluation is False, skipping this task 28285 1727204267.89373: _execute() done 28285 1727204267.89376: dumping result to json 28285 1727204267.89378: done dumping result, returning 28285 1727204267.89385: done running TaskExecutor() for managed-node1/TASK: TEST: I can enable tx_tcp_segmentation (using underscores). 
[0affcd87-79f5-57a1-d976-0000000000b2] 28285 1727204267.89391: sending task result for task 0affcd87-79f5-57a1-d976-0000000000b2 28285 1727204267.89481: done sending task result for task 0affcd87-79f5-57a1-d976-0000000000b2 28285 1727204267.89483: WORKER PROCESS EXITING skipping: [managed-node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 28285 1727204267.89526: no more pending results, returning what we have 28285 1727204267.89529: results queue empty 28285 1727204267.89530: checking for any_errors_fatal 28285 1727204267.89536: done checking for any_errors_fatal 28285 1727204267.89537: checking for max_fail_percentage 28285 1727204267.89538: done checking for max_fail_percentage 28285 1727204267.89539: checking to see if all hosts have failed and the running result is not ok 28285 1727204267.89540: done checking to see if all hosts have failed 28285 1727204267.89540: getting the remaining hosts for this loop 28285 1727204267.89542: done getting the remaining hosts for this loop 28285 1727204267.89546: getting the next task for host managed-node1 28285 1727204267.89556: done getting next task for host managed-node1 28285 1727204267.89562: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28285 1727204267.89567: ^ state is: HOST STATE: block=3, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204267.89585: getting variables 28285 1727204267.89587: in VariableManager get_vars() 28285 1727204267.89636: Calling all_inventory to load vars for managed-node1 28285 1727204267.89639: Calling groups_inventory to load vars for managed-node1 28285 1727204267.89641: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204267.89652: Calling all_plugins_play to load vars for managed-node1 28285 1727204267.89654: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204267.89657: Calling groups_plugins_play to load vars for managed-node1 28285 1727204267.89785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204267.89918: done with get_vars() 28285 1727204267.89927: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:57:47 -0400 (0:00:00.044) 0:00:08.709 ***** 28285 1727204267.89998: entering _queue_task() for managed-node1/include_tasks 28285 1727204267.90200: worker is 1 (out of 1 available) 28285 1727204267.90213: exiting _queue_task() for managed-node1/include_tasks 28285 1727204267.90224: done queuing things up, now waiting for results queue to drain 28285 1727204267.90226: waiting for pending results... 
28285 1727204267.90395: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28285 1727204267.90490: in run() - task 0affcd87-79f5-57a1-d976-0000000000ba 28285 1727204267.90500: variable 'ansible_search_path' from source: unknown 28285 1727204267.90503: variable 'ansible_search_path' from source: unknown 28285 1727204267.90533: calling self._execute() 28285 1727204267.90601: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204267.90605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204267.90614: variable 'omit' from source: magic vars 28285 1727204267.91021: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204267.93492: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204267.93585: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204267.93631: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204267.93686: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204267.93718: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204267.93813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204267.93844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204267.93890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204267.93935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204267.93957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204267.94118: variable 'ansible_distribution' from source: facts 28285 1727204267.94131: variable 'ansible_distribution_major_version' from source: facts 28285 1727204267.94157: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204267.94167: when evaluation is False, skipping this task 28285 1727204267.94176: _execute() done 28285 1727204267.94188: dumping result to json 28285 1727204267.94201: done dumping result, returning 28285 1727204267.94213: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-57a1-d976-0000000000ba] 28285 1727204267.94225: sending task result for task 0affcd87-79f5-57a1-d976-0000000000ba skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int 
< 9)", "skip_reason": "Conditional result was False" } 28285 1727204267.94389: no more pending results, returning what we have 28285 1727204267.94393: results queue empty 28285 1727204267.94394: checking for any_errors_fatal 28285 1727204267.94401: done checking for any_errors_fatal 28285 1727204267.94402: checking for max_fail_percentage 28285 1727204267.94403: done checking for max_fail_percentage 28285 1727204267.94404: checking to see if all hosts have failed and the running result is not ok 28285 1727204267.94405: done checking to see if all hosts have failed 28285 1727204267.94406: getting the remaining hosts for this loop 28285 1727204267.94408: done getting the remaining hosts for this loop 28285 1727204267.94412: getting the next task for host managed-node1 28285 1727204267.94420: done getting next task for host managed-node1 28285 1727204267.94424: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 28285 1727204267.94428: ^ state is: HOST STATE: block=3, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204267.94451: getting variables 28285 1727204267.94454: in VariableManager get_vars() 28285 1727204267.94513: Calling all_inventory to load vars for managed-node1 28285 1727204267.94516: Calling groups_inventory to load vars for managed-node1 28285 1727204267.94519: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204267.94536: Calling all_plugins_play to load vars for managed-node1 28285 1727204267.94542: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204267.94547: Calling groups_plugins_play to load vars for managed-node1 28285 1727204267.94711: done sending task result for task 0affcd87-79f5-57a1-d976-0000000000ba 28285 1727204267.94714: WORKER PROCESS EXITING 28285 1727204267.94796: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204267.94938: done with get_vars() 28285 1727204267.94945: done getting variables 28285 1727204267.94989: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:57:47 -0400 (0:00:00.050) 0:00:08.760 ***** 28285 1727204267.95014: entering _queue_task() for managed-node1/debug 28285 1727204267.95203: worker is 1 (out of 1 available) 28285 1727204267.95216: exiting _queue_task() for managed-node1/debug 28285 1727204267.95229: done queuing things up, now waiting for results queue to drain 28285 1727204267.95231: waiting for pending results... 
28285 1727204267.95400: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider 28285 1727204267.95485: in run() - task 0affcd87-79f5-57a1-d976-0000000000bb 28285 1727204267.95497: variable 'ansible_search_path' from source: unknown 28285 1727204267.95501: variable 'ansible_search_path' from source: unknown 28285 1727204267.95527: calling self._execute() 28285 1727204267.95592: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204267.95597: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204267.95605: variable 'omit' from source: magic vars 28285 1727204267.95901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204267.98018: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204267.98076: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204267.98105: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204267.98130: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204267.98149: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204267.98213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204267.98234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204267.98251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204267.98286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204267.98296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204267.98398: variable 'ansible_distribution' from source: facts 28285 1727204267.98404: variable 'ansible_distribution_major_version' from source: facts 28285 1727204267.98419: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204267.98422: when evaluation is False, skipping this task 28285 1727204267.98424: _execute() done 28285 1727204267.98427: dumping result to json 28285 1727204267.98430: done dumping result, returning 28285 1727204267.98438: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-57a1-d976-0000000000bb] 28285 1727204267.98443: sending task result for task 0affcd87-79f5-57a1-d976-0000000000bb 28285 1727204267.98529: done sending task result for task 0affcd87-79f5-57a1-d976-0000000000bb 28285 1727204267.98532: WORKER PROCESS EXITING skipping: [managed-node1] => { 
"false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 28285 1727204267.98576: no more pending results, returning what we have 28285 1727204267.98579: results queue empty 28285 1727204267.98580: checking for any_errors_fatal 28285 1727204267.98589: done checking for any_errors_fatal 28285 1727204267.98590: checking for max_fail_percentage 28285 1727204267.98591: done checking for max_fail_percentage 28285 1727204267.98592: checking to see if all hosts have failed and the running result is not ok 28285 1727204267.98593: done checking to see if all hosts have failed 28285 1727204267.98593: getting the remaining hosts for this loop 28285 1727204267.98595: done getting the remaining hosts for this loop 28285 1727204267.98599: getting the next task for host managed-node1 28285 1727204267.98605: done getting next task for host managed-node1 28285 1727204267.98609: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28285 1727204267.98612: ^ state is: HOST STATE: block=3, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204267.98630: getting variables 28285 1727204267.98631: in VariableManager get_vars() 28285 1727204267.98688: Calling all_inventory to load vars for managed-node1 28285 1727204267.98691: Calling groups_inventory to load vars for managed-node1 28285 1727204267.98693: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204267.98702: Calling all_plugins_play to load vars for managed-node1 28285 1727204267.98704: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204267.98707: Calling groups_plugins_play to load vars for managed-node1 28285 1727204267.98831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204267.98957: done with get_vars() 28285 1727204267.98968: done getting variables 28285 1727204267.99010: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:57:47 -0400 (0:00:00.040) 0:00:08.800 ***** 28285 1727204267.99034: entering _queue_task() for managed-node1/fail 28285 1727204267.99229: worker is 1 (out of 1 available) 28285 1727204267.99242: exiting _queue_task() for managed-node1/fail 28285 1727204267.99253: done queuing things up, now waiting for results queue to drain 28285 1727204267.99255: waiting for pending results... 
28285 1727204267.99420: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28285 1727204267.99512: in run() - task 0affcd87-79f5-57a1-d976-0000000000bc 28285 1727204267.99523: variable 'ansible_search_path' from source: unknown 28285 1727204267.99526: variable 'ansible_search_path' from source: unknown 28285 1727204267.99561: calling self._execute() 28285 1727204267.99623: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204267.99629: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204267.99636: variable 'omit' from source: magic vars 28285 1727204267.99945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204268.01617: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204268.01669: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204268.01698: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204268.01730: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204268.01753: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204268.01812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204268.01834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204268.01852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204268.01888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204268.01899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204268.02013: variable 'ansible_distribution' from source: facts 28285 1727204268.02019: variable 'ansible_distribution_major_version' from source: facts 28285 1727204268.02039: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204268.02043: when evaluation is False, skipping this task 28285 1727204268.02046: _execute() done 28285 1727204268.02049: dumping result to json 28285 1727204268.02051: done dumping result, returning 28285 1727204268.02057: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-57a1-d976-0000000000bc] 28285 1727204268.02065: sending task result for task 
0affcd87-79f5-57a1-d976-0000000000bc 28285 1727204268.02155: done sending task result for task 0affcd87-79f5-57a1-d976-0000000000bc 28285 1727204268.02158: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204268.02225: no more pending results, returning what we have 28285 1727204268.02228: results queue empty 28285 1727204268.02229: checking for any_errors_fatal 28285 1727204268.02235: done checking for any_errors_fatal 28285 1727204268.02235: checking for max_fail_percentage 28285 1727204268.02237: done checking for max_fail_percentage 28285 1727204268.02238: checking to see if all hosts have failed and the running result is not ok 28285 1727204268.02239: done checking to see if all hosts have failed 28285 1727204268.02239: getting the remaining hosts for this loop 28285 1727204268.02241: done getting the remaining hosts for this loop 28285 1727204268.02245: getting the next task for host managed-node1 28285 1727204268.02254: done getting next task for host managed-node1 28285 1727204268.02258: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28285 1727204268.02261: ^ state is: HOST STATE: block=3, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28285 1727204268.02282: getting variables 28285 1727204268.02284: in VariableManager get_vars() 28285 1727204268.02332: Calling all_inventory to load vars for managed-node1 28285 1727204268.02334: Calling groups_inventory to load vars for managed-node1 28285 1727204268.02336: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204268.02345: Calling all_plugins_play to load vars for managed-node1 28285 1727204268.02347: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204268.02352: Calling groups_plugins_play to load vars for managed-node1 28285 1727204268.02515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204268.02645: done with get_vars() 28285 1727204268.02655: done getting variables 28285 1727204268.02699: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:57:48 -0400 (0:00:00.036) 0:00:08.837 ***** 28285 1727204268.02722: entering _queue_task() for managed-node1/fail 28285 1727204268.02926: worker is 1 (out of 1 available) 28285 1727204268.02939: exiting _queue_task() for managed-node1/fail 28285 1727204268.02953: done queuing things up, now waiting for results queue to drain 28285 1727204268.02955: waiting for pending results... 
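Every task in this stretch of the run is skipped for the same reason: the role's guard condition, echoed back as "false_condition" in the skip result above, evaluates to False on this host. A minimal sketch of that evaluation with plain Jinja2 follows; it is not Ansible's own templating path, and the fact values are assumptions for an EL9-class managed node.

# Minimal sketch: evaluate the "false_condition" from the skip result with
# plain Jinja2. This is NOT Ansible's templating code path; the fact values
# below are assumed for illustration.
from jinja2 import Environment

assumed_facts = {
    "ansible_distribution": "CentOS",            # assumption
    "ansible_distribution_major_version": "9",   # assumption
}

condition = (
    "(ansible_distribution in ['CentOS','RedHat'] "
    "and ansible_distribution_major_version | int < 9)"
)

expr = Environment().compile_expression(condition)
print(expr(**assumed_facts))   # -> False, so the task is skipped

With a major version of 9, the "< 9" half of the conjunction fails, which matches every "skipping" result reported in this section.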
28285 1727204268.03126: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28285 1727204268.03212: in run() - task 0affcd87-79f5-57a1-d976-0000000000bd 28285 1727204268.03223: variable 'ansible_search_path' from source: unknown 28285 1727204268.03228: variable 'ansible_search_path' from source: unknown 28285 1727204268.03259: calling self._execute() 28285 1727204268.03319: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204268.03325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204268.03332: variable 'omit' from source: magic vars 28285 1727204268.03637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204268.05241: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204268.05296: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204268.05324: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204268.05353: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204268.05373: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204268.05431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204268.05454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204268.05474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204268.05500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204268.05511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204268.05612: variable 'ansible_distribution' from source: facts 28285 1727204268.05618: variable 'ansible_distribution_major_version' from source: facts 28285 1727204268.05636: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204268.05639: when evaluation is False, skipping this task 28285 1727204268.05641: _execute() done 28285 1727204268.05644: dumping result to json 28285 1727204268.05646: done dumping result, returning 28285 1727204268.05654: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-57a1-d976-0000000000bd] 28285 1727204268.05660: sending task result for task 0affcd87-79f5-57a1-d976-0000000000bd 28285 1727204268.05745: 
done sending task result for task 0affcd87-79f5-57a1-d976-0000000000bd 28285 1727204268.05750: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204268.05817: no more pending results, returning what we have 28285 1727204268.05820: results queue empty 28285 1727204268.05821: checking for any_errors_fatal 28285 1727204268.05826: done checking for any_errors_fatal 28285 1727204268.05827: checking for max_fail_percentage 28285 1727204268.05828: done checking for max_fail_percentage 28285 1727204268.05830: checking to see if all hosts have failed and the running result is not ok 28285 1727204268.05830: done checking to see if all hosts have failed 28285 1727204268.05831: getting the remaining hosts for this loop 28285 1727204268.05832: done getting the remaining hosts for this loop 28285 1727204268.05837: getting the next task for host managed-node1 28285 1727204268.05842: done getting next task for host managed-node1 28285 1727204268.05846: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28285 1727204268.05851: ^ state is: HOST STATE: block=3, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28285 1727204268.05876: getting variables 28285 1727204268.05878: in VariableManager get_vars() 28285 1727204268.05925: Calling all_inventory to load vars for managed-node1 28285 1727204268.05928: Calling groups_inventory to load vars for managed-node1 28285 1727204268.05930: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204268.05938: Calling all_plugins_play to load vars for managed-node1 28285 1727204268.05940: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204268.05941: Calling groups_plugins_play to load vars for managed-node1 28285 1727204268.06061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204268.06193: done with get_vars() 28285 1727204268.06201: done getting variables 28285 1727204268.06241: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:57:48 -0400 (0:00:00.035) 0:00:08.872 ***** 28285 1727204268.06268: entering _queue_task() for managed-node1/fail 28285 1727204268.06466: worker is 1 (out of 1 available) 28285 1727204268.06481: exiting _queue_task() for managed-node1/fail 28285 1727204268.06493: done queuing things up, now waiting for results queue to drain 28285 1727204268.06495: waiting for pending results... 
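The TASK banners above carry a timestamp and two durations, for example (0:00:00.035) 0:00:08.872. This appears to be profile_tasks-style timer output, with the parenthesised value being the previous task's duration and the trailing value the cumulative elapsed time; under that assumption, the sketch below pulls both values out of a saved copy of this log.

# Parse the two durations from a banner line such as
# "... 14:57:48 -0400 (0:00:00.035) 0:00:08.872 *****".
# Assumption: first value = previous task's duration, second = cumulative time.
import re
from datetime import timedelta

BANNER = re.compile(r"\((\d+):(\d+):([\d.]+)\)\s+(\d+):(\d+):([\d.]+)")

def to_delta(hours: str, minutes: str, seconds: str) -> timedelta:
    return timedelta(hours=int(hours), minutes=int(minutes), seconds=float(seconds))

def durations(line: str):
    match = BANNER.search(line)
    if match is None:
        return None
    task = to_delta(*match.groups()[:3])    # time spent on the previous task
    total = to_delta(*match.groups()[3:])   # cumulative playbook time
    return task, total

print(durations("Tuesday 24 September 2024 14:57:48 -0400 (0:00:00.035) 0:00:08.872 *****"))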
28285 1727204268.06657: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28285 1727204268.06745: in run() - task 0affcd87-79f5-57a1-d976-0000000000be 28285 1727204268.06758: variable 'ansible_search_path' from source: unknown 28285 1727204268.06762: variable 'ansible_search_path' from source: unknown 28285 1727204268.06795: calling self._execute() 28285 1727204268.06862: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204268.06871: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204268.06876: variable 'omit' from source: magic vars 28285 1727204268.07180: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204268.08790: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204268.08839: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204268.08869: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204268.08902: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204268.08924: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204268.08982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204268.09004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204268.09026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204268.09054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204268.09067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204268.09174: variable 'ansible_distribution' from source: facts 28285 1727204268.09180: variable 'ansible_distribution_major_version' from source: facts 28285 1727204268.09194: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204268.09198: when evaluation is False, skipping this task 28285 1727204268.09200: _execute() done 28285 1727204268.09203: dumping result to json 28285 1727204268.09205: done dumping result, returning 28285 1727204268.09212: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-57a1-d976-0000000000be] 28285 1727204268.09221: sending task result for task 0affcd87-79f5-57a1-d976-0000000000be 28285 1727204268.09307: done 
sending task result for task 0affcd87-79f5-57a1-d976-0000000000be 28285 1727204268.09310: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204268.09370: no more pending results, returning what we have 28285 1727204268.09373: results queue empty 28285 1727204268.09374: checking for any_errors_fatal 28285 1727204268.09378: done checking for any_errors_fatal 28285 1727204268.09379: checking for max_fail_percentage 28285 1727204268.09381: done checking for max_fail_percentage 28285 1727204268.09382: checking to see if all hosts have failed and the running result is not ok 28285 1727204268.09383: done checking to see if all hosts have failed 28285 1727204268.09383: getting the remaining hosts for this loop 28285 1727204268.09385: done getting the remaining hosts for this loop 28285 1727204268.09389: getting the next task for host managed-node1 28285 1727204268.09395: done getting next task for host managed-node1 28285 1727204268.09399: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28285 1727204268.09402: ^ state is: HOST STATE: block=3, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28285 1727204268.09419: getting variables 28285 1727204268.09421: in VariableManager get_vars() 28285 1727204268.09476: Calling all_inventory to load vars for managed-node1 28285 1727204268.09479: Calling groups_inventory to load vars for managed-node1 28285 1727204268.09480: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204268.09489: Calling all_plugins_play to load vars for managed-node1 28285 1727204268.09491: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204268.09493: Calling groups_plugins_play to load vars for managed-node1 28285 1727204268.09644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204268.09771: done with get_vars() 28285 1727204268.09778: done getting variables 28285 1727204268.09818: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:57:48 -0400 (0:00:00.035) 0:00:08.908 ***** 28285 1727204268.09838: entering _queue_task() for managed-node1/dnf 28285 1727204268.10028: worker is 1 (out of 1 available) 28285 1727204268.10042: exiting _queue_task() for managed-node1/dnf 28285 1727204268.10054: done queuing things up, now waiting for results queue to drain 28285 1727204268.10056: waiting for pending results... 
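The "^ state is: HOST STATE: block=3, task=14, ..." dumps above describe the play iterator's per-host position. As a reading aid only, the sketch below mirrors the printed fields in a plain dataclass and re-types the dump that precedes the DNF task; it is not Ansible's internal HostState class, and run_state/fail_state are internal enum values kept as bare ints here.

# Reading aid for the "HOST STATE: ..." dumps above; not the real
# ansible.executor.play_iterator.HostState class.
from dataclasses import dataclass
from typing import Optional

@dataclass
class HostStateView:
    block: int                  # index of the current block in the play
    task: int                   # index of the current task within that block
    rescue: int
    always: int
    handlers: int
    run_state: int              # which phase is being iterated (internal enum, kept as int)
    fail_state: int
    pre_flushing_run_state: Optional[int]
    update_handlers: bool
    pending_setup: bool
    tasks_child_state: Optional["HostStateView"] = None   # nested state for an include/block
    rescue_child_state: Optional["HostStateView"] = None
    always_child_state: Optional["HostStateView"] = None
    did_rescue: bool = False
    did_start_at_task: bool = False

# The dump shown before the "Check if updates ... DNF ..." task, re-typed by hand:
inner = HostStateView(block=0, task=6, rescue=0, always=0, handlers=0,
                      run_state=1, fail_state=0, pre_flushing_run_state=None,
                      update_handlers=True, pending_setup=False)
outer = HostStateView(block=3, task=14, rescue=0, always=0, handlers=0,
                      run_state=1, fail_state=0, pre_flushing_run_state=1,
                      update_handlers=True, pending_setup=False,
                      tasks_child_state=inner)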
28285 1727204268.10221: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28285 1727204268.10301: in run() - task 0affcd87-79f5-57a1-d976-0000000000bf 28285 1727204268.10318: variable 'ansible_search_path' from source: unknown 28285 1727204268.10322: variable 'ansible_search_path' from source: unknown 28285 1727204268.10351: calling self._execute() 28285 1727204268.10412: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204268.10420: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204268.10432: variable 'omit' from source: magic vars 28285 1727204268.10735: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204268.12334: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204268.12390: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204268.12418: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204268.12447: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204268.12467: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204268.12525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204268.12544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204268.12565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204268.12595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204268.12613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204268.12716: variable 'ansible_distribution' from source: facts 28285 1727204268.12722: variable 'ansible_distribution_major_version' from source: facts 28285 1727204268.12737: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204268.12740: when evaluation is False, skipping this task 28285 1727204268.12743: _execute() done 28285 1727204268.12746: dumping result to json 28285 1727204268.12748: done dumping result, returning 28285 1727204268.12758: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-57a1-d976-0000000000bf] 28285 1727204268.12763: sending task result for task 
0affcd87-79f5-57a1-d976-0000000000bf 28285 1727204268.12852: done sending task result for task 0affcd87-79f5-57a1-d976-0000000000bf 28285 1727204268.12855: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204268.12929: no more pending results, returning what we have 28285 1727204268.12932: results queue empty 28285 1727204268.12933: checking for any_errors_fatal 28285 1727204268.12939: done checking for any_errors_fatal 28285 1727204268.12939: checking for max_fail_percentage 28285 1727204268.12941: done checking for max_fail_percentage 28285 1727204268.12942: checking to see if all hosts have failed and the running result is not ok 28285 1727204268.12943: done checking to see if all hosts have failed 28285 1727204268.12943: getting the remaining hosts for this loop 28285 1727204268.12945: done getting the remaining hosts for this loop 28285 1727204268.12949: getting the next task for host managed-node1 28285 1727204268.12956: done getting next task for host managed-node1 28285 1727204268.12960: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28285 1727204268.12962: ^ state is: HOST STATE: block=3, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28285 1727204268.12983: getting variables 28285 1727204268.12985: in VariableManager get_vars() 28285 1727204268.13030: Calling all_inventory to load vars for managed-node1 28285 1727204268.13033: Calling groups_inventory to load vars for managed-node1 28285 1727204268.13035: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204268.13043: Calling all_plugins_play to load vars for managed-node1 28285 1727204268.13046: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204268.13048: Calling groups_plugins_play to load vars for managed-node1 28285 1727204268.13162: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204268.13382: done with get_vars() 28285 1727204268.13390: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28285 1727204268.13442: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:57:48 -0400 (0:00:00.036) 0:00:08.944 ***** 28285 1727204268.13467: entering _queue_task() for managed-node1/yum 28285 1727204268.13660: worker is 1 (out of 1 available) 28285 1727204268.13674: exiting _queue_task() for managed-node1/yum 28285 1727204268.13686: done queuing things up, now waiting for results queue to drain 28285 1727204268.13688: waiting for pending results... 
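The line "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" above shows that the yum action name is resolved to the dnf action plugin on this ansible-core version. The sketch below only illustrates that kind of routing-table lookup with a hand-written mapping; Ansible's real resolution reads the builtin/collection runtime metadata and also handles chained redirects and deprecations.

# Generic illustration of the action-plugin redirect seen above.
# The routing table is a hand-written simplification.
REDIRECTS = {
    "ansible.builtin.yum": "ansible.builtin.dnf",   # as reported in the log
}

def resolve(requested: str, routing: dict[str, str]) -> str:
    """Follow redirects until a name with no further redirect is reached."""
    seen = set()
    name = requested
    while name in routing:
        if name in seen:                      # guard against a redirect loop
            raise RuntimeError(f"redirect loop at {name}")
        seen.add(name)
        name = routing[name]
    return name

print(resolve("ansible.builtin.yum", REDIRECTS))   # ansible.builtin.dnf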
28285 1727204268.13853: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28285 1727204268.13933: in run() - task 0affcd87-79f5-57a1-d976-0000000000c0 28285 1727204268.13944: variable 'ansible_search_path' from source: unknown 28285 1727204268.13947: variable 'ansible_search_path' from source: unknown 28285 1727204268.13980: calling self._execute() 28285 1727204268.14041: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204268.14045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204268.14056: variable 'omit' from source: magic vars 28285 1727204268.14360: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204268.16740: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204268.16789: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204268.16817: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204268.16847: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204268.16869: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204268.16929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204268.16954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204268.16973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204268.16999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204268.17010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204268.17111: variable 'ansible_distribution' from source: facts 28285 1727204268.17117: variable 'ansible_distribution_major_version' from source: facts 28285 1727204268.17132: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204268.17137: when evaluation is False, skipping this task 28285 1727204268.17142: _execute() done 28285 1727204268.17147: dumping result to json 28285 1727204268.17154: done dumping result, returning 28285 1727204268.17165: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-57a1-d976-0000000000c0] 28285 1727204268.17171: sending task result for task 
0affcd87-79f5-57a1-d976-0000000000c0 28285 1727204268.17260: done sending task result for task 0affcd87-79f5-57a1-d976-0000000000c0 28285 1727204268.17265: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204268.17313: no more pending results, returning what we have 28285 1727204268.17316: results queue empty 28285 1727204268.17316: checking for any_errors_fatal 28285 1727204268.17322: done checking for any_errors_fatal 28285 1727204268.17323: checking for max_fail_percentage 28285 1727204268.17325: done checking for max_fail_percentage 28285 1727204268.17325: checking to see if all hosts have failed and the running result is not ok 28285 1727204268.17326: done checking to see if all hosts have failed 28285 1727204268.17327: getting the remaining hosts for this loop 28285 1727204268.17329: done getting the remaining hosts for this loop 28285 1727204268.17332: getting the next task for host managed-node1 28285 1727204268.17338: done getting next task for host managed-node1 28285 1727204268.17342: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28285 1727204268.17345: ^ state is: HOST STATE: block=3, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28285 1727204268.17368: getting variables 28285 1727204268.17370: in VariableManager get_vars() 28285 1727204268.17419: Calling all_inventory to load vars for managed-node1 28285 1727204268.17422: Calling groups_inventory to load vars for managed-node1 28285 1727204268.17424: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204268.17432: Calling all_plugins_play to load vars for managed-node1 28285 1727204268.17435: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204268.17437: Calling groups_plugins_play to load vars for managed-node1 28285 1727204268.17599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204268.17723: done with get_vars() 28285 1727204268.17731: done getting variables 28285 1727204268.17773: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:57:48 -0400 (0:00:00.043) 0:00:08.987 ***** 28285 1727204268.17798: entering _queue_task() for managed-node1/fail 28285 1727204268.17988: worker is 1 (out of 1 available) 28285 1727204268.18000: exiting _queue_task() for managed-node1/fail 28285 1727204268.18012: done queuing things up, now waiting for results queue to drain 28285 1727204268.18014: waiting for pending results... 
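Each "in VariableManager get_vars()" block above calls the same sequence of sources for managed-node1, from all_inventory through groups_plugins_play. The sketch below only illustrates the idea that later sources override earlier ones on key collisions; the variable contents are invented, and Ansible's actual precedence rules involve many more layers than shown here.

# The source names match the get_vars() trace above; the variable contents
# are invented for illustration only.
sources_in_call_order = [
    ("all_inventory",            {"network_provider": "nm"}),
    ("groups_inventory",         {}),
    ("all_plugins_inventory",    {}),
    ("all_plugins_play",         {}),
    ("groups_plugins_inventory", {}),
    ("groups_plugins_play",      {"network_connections": []}),
]

merged: dict = {}
for _name, variables in sources_in_call_order:
    merged.update(variables)   # later sources win on key collisions

print(merged)   # {'network_provider': 'nm', 'network_connections': []}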
28285 1727204268.18213: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28285 1727204268.18343: in run() - task 0affcd87-79f5-57a1-d976-0000000000c1 28285 1727204268.18369: variable 'ansible_search_path' from source: unknown 28285 1727204268.18378: variable 'ansible_search_path' from source: unknown 28285 1727204268.18414: calling self._execute() 28285 1727204268.18503: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204268.18514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204268.18526: variable 'omit' from source: magic vars 28285 1727204268.18958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204268.21384: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204268.21475: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204268.21522: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204268.21565: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204268.21597: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204268.21685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204268.21719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204268.21758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204268.21807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204268.21825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204268.21984: variable 'ansible_distribution' from source: facts 28285 1727204268.21996: variable 'ansible_distribution_major_version' from source: facts 28285 1727204268.22016: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204268.22022: when evaluation is False, skipping this task 28285 1727204268.22028: _execute() done 28285 1727204268.22033: dumping result to json 28285 1727204268.22038: done dumping result, returning 28285 1727204268.22047: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-57a1-d976-0000000000c1] 28285 1727204268.22059: sending task result for task 0affcd87-79f5-57a1-d976-0000000000c1 28285 1727204268.22169: done sending task result for task 
0affcd87-79f5-57a1-d976-0000000000c1 skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204268.22219: no more pending results, returning what we have 28285 1727204268.22223: results queue empty 28285 1727204268.22224: checking for any_errors_fatal 28285 1727204268.22230: done checking for any_errors_fatal 28285 1727204268.22231: checking for max_fail_percentage 28285 1727204268.22232: done checking for max_fail_percentage 28285 1727204268.22233: checking to see if all hosts have failed and the running result is not ok 28285 1727204268.22234: done checking to see if all hosts have failed 28285 1727204268.22235: getting the remaining hosts for this loop 28285 1727204268.22236: done getting the remaining hosts for this loop 28285 1727204268.22240: getting the next task for host managed-node1 28285 1727204268.22246: done getting next task for host managed-node1 28285 1727204268.22253: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 28285 1727204268.22256: ^ state is: HOST STATE: block=3, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204268.22276: getting variables 28285 1727204268.22278: in VariableManager get_vars() 28285 1727204268.22333: Calling all_inventory to load vars for managed-node1 28285 1727204268.22336: Calling groups_inventory to load vars for managed-node1 28285 1727204268.22339: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204268.22352: Calling all_plugins_play to load vars for managed-node1 28285 1727204268.22354: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204268.22357: Calling groups_plugins_play to load vars for managed-node1 28285 1727204268.22539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204268.22757: done with get_vars() 28285 1727204268.22772: done getting variables 28285 1727204268.22845: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:57:48 -0400 (0:00:00.050) 0:00:09.038 ***** 28285 1727204268.23002: entering _queue_task() for managed-node1/package 28285 1727204268.23109: WORKER PROCESS EXITING 28285 1727204268.23440: worker is 1 (out of 1 available) 28285 1727204268.23455: exiting _queue_task() for managed-node1/package 28285 1727204268.23469: done queuing things up, now waiting for results queue to drain 28285 1727204268.23471: waiting for pending results... 
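The "Install packages" task above is queued through the generic package action, which normally hands off to a concrete package-manager module based on the discovered pkg_mgr fact. The sketch below shows that dispatch idea with a hand-written mapping and an assumed fact value; it is not the real ansible.builtin.package plugin logic.

# Sketch of the dispatch idea behind the generic "package" action queued
# above; the mapping and fact value are illustrative.
BACKENDS = {
    "dnf": "ansible.builtin.dnf",
    "yum": "ansible.builtin.yum",   # itself redirected to dnf on this core version
    "apt": "ansible.builtin.apt",
}

def pick_backend(facts: dict) -> str:
    pkg_mgr = facts.get("ansible_pkg_mgr", "auto")
    try:
        return BACKENDS[pkg_mgr]
    except KeyError:
        raise ValueError(f"no known package backend for pkg_mgr={pkg_mgr!r}")

print(pick_backend({"ansible_pkg_mgr": "dnf"}))   # ansible.builtin.dnf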
28285 1727204268.23740: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages 28285 1727204268.23877: in run() - task 0affcd87-79f5-57a1-d976-0000000000c2 28285 1727204268.23895: variable 'ansible_search_path' from source: unknown 28285 1727204268.23902: variable 'ansible_search_path' from source: unknown 28285 1727204268.23943: calling self._execute() 28285 1727204268.24033: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204268.24044: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204268.24060: variable 'omit' from source: magic vars 28285 1727204268.24509: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204268.27095: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204268.27177: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204268.27219: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204268.27271: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204268.27303: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204268.27394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204268.27429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204268.27472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204268.27522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204268.27544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204268.27702: variable 'ansible_distribution' from source: facts 28285 1727204268.27716: variable 'ansible_distribution_major_version' from source: facts 28285 1727204268.27739: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204268.27747: when evaluation is False, skipping this task 28285 1727204268.27758: _execute() done 28285 1727204268.27768: dumping result to json 28285 1727204268.27777: done dumping result, returning 28285 1727204268.27794: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-57a1-d976-0000000000c2] 28285 1727204268.27806: sending task result for task 0affcd87-79f5-57a1-d976-0000000000c2 skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": 
"Conditional result was False" } 28285 1727204268.27977: no more pending results, returning what we have 28285 1727204268.27980: results queue empty 28285 1727204268.27981: checking for any_errors_fatal 28285 1727204268.27987: done checking for any_errors_fatal 28285 1727204268.27988: checking for max_fail_percentage 28285 1727204268.27990: done checking for max_fail_percentage 28285 1727204268.27991: checking to see if all hosts have failed and the running result is not ok 28285 1727204268.27992: done checking to see if all hosts have failed 28285 1727204268.27993: getting the remaining hosts for this loop 28285 1727204268.27995: done getting the remaining hosts for this loop 28285 1727204268.27999: getting the next task for host managed-node1 28285 1727204268.28006: done getting next task for host managed-node1 28285 1727204268.28010: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28285 1727204268.28013: ^ state is: HOST STATE: block=3, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204268.28035: getting variables 28285 1727204268.28037: in VariableManager get_vars() 28285 1727204268.28098: Calling all_inventory to load vars for managed-node1 28285 1727204268.28101: Calling groups_inventory to load vars for managed-node1 28285 1727204268.28104: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204268.28116: Calling all_plugins_play to load vars for managed-node1 28285 1727204268.28118: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204268.28122: Calling groups_plugins_play to load vars for managed-node1 28285 1727204268.28358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204268.28579: done with get_vars() 28285 1727204268.28676: done getting variables 28285 1727204268.28770: done sending task result for task 0affcd87-79f5-57a1-d976-0000000000c2 28285 1727204268.28774: WORKER PROCESS EXITING 28285 1727204268.28816: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:57:48 -0400 (0:00:00.059) 0:00:09.098 ***** 28285 1727204268.28860: entering _queue_task() for managed-node1/package 28285 1727204268.29309: worker is 1 (out of 1 available) 28285 1727204268.29322: exiting _queue_task() for managed-node1/package 28285 1727204268.29334: done queuing things up, now waiting for results queue to drain 28285 1727204268.29335: waiting for pending results... 
28285 1727204268.29615: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28285 1727204268.29759: in run() - task 0affcd87-79f5-57a1-d976-0000000000c3 28285 1727204268.29788: variable 'ansible_search_path' from source: unknown 28285 1727204268.29797: variable 'ansible_search_path' from source: unknown 28285 1727204268.29837: calling self._execute() 28285 1727204268.29929: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204268.29941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204268.29956: variable 'omit' from source: magic vars 28285 1727204268.30357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204268.32102: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204268.32185: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204268.32231: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204268.32277: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204268.32320: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204268.32404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204268.32443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204268.32475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204268.32533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204268.32554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204268.32731: variable 'ansible_distribution' from source: facts 28285 1727204268.32749: variable 'ansible_distribution_major_version' from source: facts 28285 1727204268.32807: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204268.32836: when evaluation is False, skipping this task 28285 1727204268.32862: _execute() done 28285 1727204268.32871: dumping result to json 28285 1727204268.32878: done dumping result, returning 28285 1727204268.32888: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-57a1-d976-0000000000c3] 28285 1727204268.32898: sending task result for task 0affcd87-79f5-57a1-d976-0000000000c3 28285 1727204268.33028: done sending task result for task 
0affcd87-79f5-57a1-d976-0000000000c3 28285 1727204268.33031: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204268.33124: no more pending results, returning what we have 28285 1727204268.33128: results queue empty 28285 1727204268.33128: checking for any_errors_fatal 28285 1727204268.33134: done checking for any_errors_fatal 28285 1727204268.33135: checking for max_fail_percentage 28285 1727204268.33136: done checking for max_fail_percentage 28285 1727204268.33137: checking to see if all hosts have failed and the running result is not ok 28285 1727204268.33138: done checking to see if all hosts have failed 28285 1727204268.33139: getting the remaining hosts for this loop 28285 1727204268.33140: done getting the remaining hosts for this loop 28285 1727204268.33144: getting the next task for host managed-node1 28285 1727204268.33151: done getting next task for host managed-node1 28285 1727204268.33155: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28285 1727204268.33157: ^ state is: HOST STATE: block=3, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28285 1727204268.33176: getting variables 28285 1727204268.33178: in VariableManager get_vars() 28285 1727204268.33237: Calling all_inventory to load vars for managed-node1 28285 1727204268.33240: Calling groups_inventory to load vars for managed-node1 28285 1727204268.33242: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204268.33250: Calling all_plugins_play to load vars for managed-node1 28285 1727204268.33253: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204268.33256: Calling groups_plugins_play to load vars for managed-node1 28285 1727204268.33379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204268.33503: done with get_vars() 28285 1727204268.33512: done getting variables 28285 1727204268.33555: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:57:48 -0400 (0:00:00.047) 0:00:09.145 ***** 28285 1727204268.33580: entering _queue_task() for managed-node1/package 28285 1727204268.33771: worker is 1 (out of 1 available) 28285 1727204268.33782: exiting _queue_task() for managed-node1/package 28285 1727204268.33795: done queuing things up, now waiting for results queue to drain 28285 1727204268.33797: waiting for pending results... 
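The repeated "Loading FilterModule ... (found_in_cache=True, class_only=False)" lines show the filter plugins being served from the loader's in-memory cache after their first import. The sketch below is only a generic memoising-loader illustration keyed by a fully qualified name, not Ansible's PluginLoader; the stdlib json module stands in for a plugin module.

# Generic illustration of the "found_in_cache=True" behaviour: load once,
# then serve repeat requests from an in-memory cache.
import importlib
from types import ModuleType

_cache: dict[str, ModuleType] = {}

def get_plugin_module(fqcn: str, python_module: str) -> tuple[ModuleType, bool]:
    """Return (module, found_in_cache) for a plugin identified by fqcn."""
    if fqcn in _cache:
        return _cache[fqcn], True
    mod = importlib.import_module(python_module)   # first load hits the import system
    _cache[fqcn] = mod
    return mod, False

print(get_plugin_module("demo.json", "json")[1])   # False (first load)
print(get_plugin_module("demo.json", "json")[1])   # True  (served from cache)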
28285 1727204268.33962: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28285 1727204268.34049: in run() - task 0affcd87-79f5-57a1-d976-0000000000c4 28285 1727204268.34062: variable 'ansible_search_path' from source: unknown 28285 1727204268.34067: variable 'ansible_search_path' from source: unknown 28285 1727204268.34097: calling self._execute() 28285 1727204268.34155: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204268.34161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204268.34170: variable 'omit' from source: magic vars 28285 1727204268.34473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204268.36715: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204268.36785: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204268.36833: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204268.36875: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204268.36907: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204268.36993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204268.37030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204268.37061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204268.37110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204268.37135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204268.37286: variable 'ansible_distribution' from source: facts 28285 1727204268.37291: variable 'ansible_distribution_major_version' from source: facts 28285 1727204268.37307: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204268.37310: when evaluation is False, skipping this task 28285 1727204268.37312: _execute() done 28285 1727204268.37314: dumping result to json 28285 1727204268.37317: done dumping result, returning 28285 1727204268.37327: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-57a1-d976-0000000000c4] 28285 1727204268.37335: sending task result for task 0affcd87-79f5-57a1-d976-0000000000c4 28285 1727204268.37438: done sending task result for task 0affcd87-79f5-57a1-d976-0000000000c4 28285 
1727204268.37441: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204268.37488: no more pending results, returning what we have 28285 1727204268.37492: results queue empty 28285 1727204268.37492: checking for any_errors_fatal 28285 1727204268.37498: done checking for any_errors_fatal 28285 1727204268.37498: checking for max_fail_percentage 28285 1727204268.37500: done checking for max_fail_percentage 28285 1727204268.37501: checking to see if all hosts have failed and the running result is not ok 28285 1727204268.37502: done checking to see if all hosts have failed 28285 1727204268.37503: getting the remaining hosts for this loop 28285 1727204268.37505: done getting the remaining hosts for this loop 28285 1727204268.37509: getting the next task for host managed-node1 28285 1727204268.37515: done getting next task for host managed-node1 28285 1727204268.37518: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28285 1727204268.37521: ^ state is: HOST STATE: block=3, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28285 1727204268.37540: getting variables 28285 1727204268.37541: in VariableManager get_vars() 28285 1727204268.37594: Calling all_inventory to load vars for managed-node1 28285 1727204268.37597: Calling groups_inventory to load vars for managed-node1 28285 1727204268.37599: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204268.37607: Calling all_plugins_play to load vars for managed-node1 28285 1727204268.37609: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204268.37612: Calling groups_plugins_play to load vars for managed-node1 28285 1727204268.37781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204268.37905: done with get_vars() 28285 1727204268.37913: done getting variables 28285 1727204268.37954: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:57:48 -0400 (0:00:00.043) 0:00:09.189 ***** 28285 1727204268.37980: entering _queue_task() for managed-node1/service 28285 1727204268.38182: worker is 1 (out of 1 available) 28285 1727204268.38195: exiting _queue_task() for managed-node1/service 28285 1727204268.38207: done queuing things up, now waiting for results queue to drain 28285 1727204268.38209: waiting for pending results... 
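
Every skip in this stretch of the run follows the same pattern: the task's when: clause is evaluated against the gathered facts, it comes out False, and the executor reports only skip_reason plus the false_condition string instead of running the module. As an illustration, the python3-libnmstate task skipped just above could look roughly like the sketch below; the module and its arguments are assumptions inferred from the task name (the real task presumably also checks whether network_state is in use, as its title suggests), and only the when expression is taken from the log.

- name: Install python3-libnmstate when using network_state variable
  ansible.builtin.package:        # assumed module; the real task body is not shown in this log
    name: python3-libnmstate      # package name inferred from the task title
    state: present
  when: ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9
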
28285 1727204268.38381: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28285 1727204268.38471: in run() - task 0affcd87-79f5-57a1-d976-0000000000c5 28285 1727204268.38481: variable 'ansible_search_path' from source: unknown 28285 1727204268.38484: variable 'ansible_search_path' from source: unknown 28285 1727204268.38512: calling self._execute() 28285 1727204268.38576: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204268.38580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204268.38588: variable 'omit' from source: magic vars 28285 1727204268.38898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204268.41412: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204268.41502: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204268.41545: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204268.41593: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204268.41623: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204268.41710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204268.41743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204268.41779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204268.41831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204268.41852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204268.42006: variable 'ansible_distribution' from source: facts 28285 1727204268.42025: variable 'ansible_distribution_major_version' from source: facts 28285 1727204268.42049: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204268.42057: when evaluation is False, skipping this task 28285 1727204268.42063: _execute() done 28285 1727204268.42073: dumping result to json 28285 1727204268.42080: done dumping result, returning 28285 1727204268.42091: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-57a1-d976-0000000000c5] 28285 1727204268.42101: sending task result for task 0affcd87-79f5-57a1-d976-0000000000c5 skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in 
['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204268.42255: no more pending results, returning what we have 28285 1727204268.42259: results queue empty 28285 1727204268.42260: checking for any_errors_fatal 28285 1727204268.42269: done checking for any_errors_fatal 28285 1727204268.42270: checking for max_fail_percentage 28285 1727204268.42271: done checking for max_fail_percentage 28285 1727204268.42272: checking to see if all hosts have failed and the running result is not ok 28285 1727204268.42273: done checking to see if all hosts have failed 28285 1727204268.42274: getting the remaining hosts for this loop 28285 1727204268.42275: done getting the remaining hosts for this loop 28285 1727204268.42279: getting the next task for host managed-node1 28285 1727204268.42285: done getting next task for host managed-node1 28285 1727204268.42289: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28285 1727204268.42292: ^ state is: HOST STATE: block=3, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204268.42313: getting variables 28285 1727204268.42315: in VariableManager get_vars() 28285 1727204268.42375: Calling all_inventory to load vars for managed-node1 28285 1727204268.42378: Calling groups_inventory to load vars for managed-node1 28285 1727204268.42380: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204268.42391: Calling all_plugins_play to load vars for managed-node1 28285 1727204268.42393: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204268.42396: Calling groups_plugins_play to load vars for managed-node1 28285 1727204268.42551: done sending task result for task 0affcd87-79f5-57a1-d976-0000000000c5 28285 1727204268.42555: WORKER PROCESS EXITING 28285 1727204268.42581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204268.42897: done with get_vars() 28285 1727204268.42906: done getting variables 28285 1727204268.42948: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:57:48 -0400 (0:00:00.049) 0:00:09.239 ***** 28285 1727204268.42979: entering _queue_task() for managed-node1/service 28285 1727204268.43189: worker is 1 (out of 1 available) 28285 1727204268.43203: exiting _queue_task() for managed-node1/service 28285 1727204268.43215: done queuing things up, now waiting for results queue to drain 28285 1727204268.43216: waiting for pending results... 
28285 1727204268.43386: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28285 1727204268.43473: in run() - task 0affcd87-79f5-57a1-d976-0000000000c6 28285 1727204268.43484: variable 'ansible_search_path' from source: unknown 28285 1727204268.43488: variable 'ansible_search_path' from source: unknown 28285 1727204268.43516: calling self._execute() 28285 1727204268.43578: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204268.43581: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204268.43589: variable 'omit' from source: magic vars 28285 1727204268.43898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204268.46007: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204268.46060: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204268.46091: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204268.46116: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204268.46136: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204268.46197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204268.46216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204268.46234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204268.46268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204268.46275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204268.46377: variable 'ansible_distribution' from source: facts 28285 1727204268.46382: variable 'ansible_distribution_major_version' from source: facts 28285 1727204268.46402: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204268.46405: when evaluation is False, skipping this task 28285 1727204268.46407: _execute() done 28285 1727204268.46410: dumping result to json 28285 1727204268.46412: done dumping result, returning 28285 1727204268.46420: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-57a1-d976-0000000000c6] 28285 1727204268.46425: sending task result for task 0affcd87-79f5-57a1-d976-0000000000c6 28285 1727204268.46516: done sending task result for task 0affcd87-79f5-57a1-d976-0000000000c6 28285 1727204268.46519: WORKER PROCESS EXITING skipping: 
[managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28285 1727204268.46558: no more pending results, returning what we have 28285 1727204268.46561: results queue empty 28285 1727204268.46562: checking for any_errors_fatal 28285 1727204268.46569: done checking for any_errors_fatal 28285 1727204268.46570: checking for max_fail_percentage 28285 1727204268.46571: done checking for max_fail_percentage 28285 1727204268.46572: checking to see if all hosts have failed and the running result is not ok 28285 1727204268.46573: done checking to see if all hosts have failed 28285 1727204268.46574: getting the remaining hosts for this loop 28285 1727204268.46575: done getting the remaining hosts for this loop 28285 1727204268.46579: getting the next task for host managed-node1 28285 1727204268.46585: done getting next task for host managed-node1 28285 1727204268.46589: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28285 1727204268.46592: ^ state is: HOST STATE: block=3, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204268.46611: getting variables 28285 1727204268.46613: in VariableManager get_vars() 28285 1727204268.46667: Calling all_inventory to load vars for managed-node1 28285 1727204268.46670: Calling groups_inventory to load vars for managed-node1 28285 1727204268.46672: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204268.46682: Calling all_plugins_play to load vars for managed-node1 28285 1727204268.46684: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204268.46687: Calling groups_plugins_play to load vars for managed-node1 28285 1727204268.46824: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204268.46990: done with get_vars() 28285 1727204268.46997: done getting variables 28285 1727204268.47042: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:57:48 -0400 (0:00:00.040) 0:00:09.280 ***** 28285 1727204268.47068: entering _queue_task() for managed-node1/service 28285 1727204268.47268: worker is 1 (out of 1 available) 28285 1727204268.47283: exiting _queue_task() for managed-node1/service 28285 1727204268.47294: done queuing things up, now waiting for results queue to drain 28285 1727204268.47296: waiting for pending results... 
28285 1727204268.47488: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28285 1727204268.47566: in run() - task 0affcd87-79f5-57a1-d976-0000000000c7 28285 1727204268.47578: variable 'ansible_search_path' from source: unknown 28285 1727204268.47582: variable 'ansible_search_path' from source: unknown 28285 1727204268.47609: calling self._execute() 28285 1727204268.47668: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204268.47674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204268.47683: variable 'omit' from source: magic vars 28285 1727204268.47991: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204268.49675: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204268.49728: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204268.49759: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204268.49787: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204268.49809: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204268.49869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204268.49891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204268.49909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204268.49936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204268.49947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204268.50051: variable 'ansible_distribution' from source: facts 28285 1727204268.50060: variable 'ansible_distribution_major_version' from source: facts 28285 1727204268.50084: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204268.50087: when evaluation is False, skipping this task 28285 1727204268.50090: _execute() done 28285 1727204268.50094: dumping result to json 28285 1727204268.50096: done dumping result, returning 28285 1727204268.50101: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-57a1-d976-0000000000c7] 28285 1727204268.50110: sending task result for task 0affcd87-79f5-57a1-d976-0000000000c7 28285 1727204268.50200: done sending task result for task 0affcd87-79f5-57a1-d976-0000000000c7 28285 1727204268.50203: WORKER PROCESS EXITING skipping: 
[managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204268.50249: no more pending results, returning what we have 28285 1727204268.50252: results queue empty 28285 1727204268.50253: checking for any_errors_fatal 28285 1727204268.50259: done checking for any_errors_fatal 28285 1727204268.50259: checking for max_fail_percentage 28285 1727204268.50261: done checking for max_fail_percentage 28285 1727204268.50262: checking to see if all hosts have failed and the running result is not ok 28285 1727204268.50263: done checking to see if all hosts have failed 28285 1727204268.50273: getting the remaining hosts for this loop 28285 1727204268.50275: done getting the remaining hosts for this loop 28285 1727204268.50279: getting the next task for host managed-node1 28285 1727204268.50286: done getting next task for host managed-node1 28285 1727204268.50290: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 28285 1727204268.50293: ^ state is: HOST STATE: block=3, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204268.50310: getting variables 28285 1727204268.50312: in VariableManager get_vars() 28285 1727204268.50366: Calling all_inventory to load vars for managed-node1 28285 1727204268.50369: Calling groups_inventory to load vars for managed-node1 28285 1727204268.50371: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204268.50380: Calling all_plugins_play to load vars for managed-node1 28285 1727204268.50381: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204268.50384: Calling groups_plugins_play to load vars for managed-node1 28285 1727204268.50511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204268.50638: done with get_vars() 28285 1727204268.50650: done getting variables 28285 1727204268.50692: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:57:48 -0400 (0:00:00.036) 0:00:09.317 ***** 28285 1727204268.50714: entering _queue_task() for managed-node1/service 28285 1727204268.50914: worker is 1 (out of 1 available) 28285 1727204268.50927: exiting _queue_task() for managed-node1/service 28285 1727204268.50938: done queuing things up, now waiting for results queue to drain 28285 1727204268.50940: waiting for pending results... 
28285 1727204268.51159: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service 28285 1727204268.51295: in run() - task 0affcd87-79f5-57a1-d976-0000000000c8 28285 1727204268.51317: variable 'ansible_search_path' from source: unknown 28285 1727204268.51325: variable 'ansible_search_path' from source: unknown 28285 1727204268.51365: calling self._execute() 28285 1727204268.51452: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204268.51465: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204268.51478: variable 'omit' from source: magic vars 28285 1727204268.51937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204268.53981: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204268.54027: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204268.54059: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204268.54086: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204268.54113: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204268.54175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204268.54194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204268.54216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204268.54242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204268.54257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204268.54368: variable 'ansible_distribution' from source: facts 28285 1727204268.54371: variable 'ansible_distribution_major_version' from source: facts 28285 1727204268.54387: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204268.54390: when evaluation is False, skipping this task 28285 1727204268.54392: _execute() done 28285 1727204268.54395: dumping result to json 28285 1727204268.54397: done dumping result, returning 28285 1727204268.54405: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-57a1-d976-0000000000c8] 28285 1727204268.54411: sending task result for task 0affcd87-79f5-57a1-d976-0000000000c8 28285 1727204268.54502: done sending task result for task 0affcd87-79f5-57a1-d976-0000000000c8 28285 1727204268.54505: WORKER PROCESS EXITING skipping: [managed-node1] => { 
"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28285 1727204268.54572: no more pending results, returning what we have 28285 1727204268.54575: results queue empty 28285 1727204268.54576: checking for any_errors_fatal 28285 1727204268.54581: done checking for any_errors_fatal 28285 1727204268.54582: checking for max_fail_percentage 28285 1727204268.54583: done checking for max_fail_percentage 28285 1727204268.54585: checking to see if all hosts have failed and the running result is not ok 28285 1727204268.54585: done checking to see if all hosts have failed 28285 1727204268.54586: getting the remaining hosts for this loop 28285 1727204268.54588: done getting the remaining hosts for this loop 28285 1727204268.54592: getting the next task for host managed-node1 28285 1727204268.54597: done getting next task for host managed-node1 28285 1727204268.54602: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28285 1727204268.54604: ^ state is: HOST STATE: block=3, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204268.54623: getting variables 28285 1727204268.54625: in VariableManager get_vars() 28285 1727204268.54683: Calling all_inventory to load vars for managed-node1 28285 1727204268.54685: Calling groups_inventory to load vars for managed-node1 28285 1727204268.54688: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204268.54696: Calling all_plugins_play to load vars for managed-node1 28285 1727204268.54698: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204268.54700: Calling groups_plugins_play to load vars for managed-node1 28285 1727204268.54947: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204268.55174: done with get_vars() 28285 1727204268.55184: done getting variables 28285 1727204268.55250: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:57:48 -0400 (0:00:00.045) 0:00:09.362 ***** 28285 1727204268.55284: entering _queue_task() for managed-node1/copy 28285 1727204268.55582: worker is 1 (out of 1 available) 28285 1727204268.55595: exiting _queue_task() for managed-node1/copy 28285 1727204268.55605: done queuing things up, now waiting for results queue to drain 28285 1727204268.55607: waiting for pending results... 
28285 1727204268.55908: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28285 1727204268.56060: in run() - task 0affcd87-79f5-57a1-d976-0000000000c9 28285 1727204268.56086: variable 'ansible_search_path' from source: unknown 28285 1727204268.56097: variable 'ansible_search_path' from source: unknown 28285 1727204268.56136: calling self._execute() 28285 1727204268.56241: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204268.56257: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204268.56278: variable 'omit' from source: magic vars 28285 1727204268.56746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204268.59315: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204268.59690: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204268.59736: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204268.59784: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204268.59814: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204268.59903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204268.59940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204268.59983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204268.60029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204268.60047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204268.60210: variable 'ansible_distribution' from source: facts 28285 1727204268.60221: variable 'ansible_distribution_major_version' from source: facts 28285 1727204268.60244: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204268.60256: when evaluation is False, skipping this task 28285 1727204268.60265: _execute() done 28285 1727204268.60274: dumping result to json 28285 1727204268.60282: done dumping result, returning 28285 1727204268.60293: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-57a1-d976-0000000000c9] 28285 1727204268.60309: sending task result for task 0affcd87-79f5-57a1-d976-0000000000c9 skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204268.60457: no more pending results, returning what we have 28285 1727204268.60461: results queue empty 28285 1727204268.60462: checking for any_errors_fatal 28285 1727204268.60470: done checking for any_errors_fatal 28285 1727204268.60471: checking for max_fail_percentage 28285 1727204268.60473: done checking for max_fail_percentage 28285 1727204268.60474: checking to see if all hosts have failed and the running result is not ok 28285 1727204268.60475: done checking to see if all hosts have failed 28285 1727204268.60476: getting the remaining hosts for this loop 28285 1727204268.60477: done getting the remaining hosts for this loop 28285 1727204268.60482: getting the next task for host managed-node1 28285 1727204268.60489: done getting next task for host managed-node1 28285 1727204268.60493: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28285 1727204268.60497: ^ state is: HOST STATE: block=3, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204268.60516: getting variables 28285 1727204268.60518: in VariableManager get_vars() 28285 1727204268.60580: Calling all_inventory to load vars for managed-node1 28285 1727204268.60583: Calling groups_inventory to load vars for managed-node1 28285 1727204268.60586: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204268.60598: Calling all_plugins_play to load vars for managed-node1 28285 1727204268.60601: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204268.60605: Calling groups_plugins_play to load vars for managed-node1 28285 1727204268.60795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204268.61026: done with get_vars() 28285 1727204268.61037: done getting variables 28285 1727204268.61241: done sending task result for task 0affcd87-79f5-57a1-d976-0000000000c9 28285 1727204268.61245: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:57:48 -0400 (0:00:00.059) 0:00:09.422 ***** 28285 1727204268.61266: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 28285 1727204268.61740: worker is 1 (out of 1 available) 28285 1727204268.61755: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 28285 1727204268.61773: done queuing things up, now waiting for results queue to drain 28285 1727204268.61775: waiting for pending results... 
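
"Configure networking connection profiles" is queued through the collection's own network_connections plugin rather than a builtin module, and it consumes the role's network_connections variable. A minimal sketch of how a play might feed that variable to the role is shown below; the host pattern and the connection profile itself are purely illustrative, not taken from this run.

- hosts: managed-node1
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: eth0              # illustrative profile only
            type: ethernet
            state: up
            ip:
              dhcp4: true
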
28285 1727204268.62037: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28285 1727204268.62178: in run() - task 0affcd87-79f5-57a1-d976-0000000000ca 28285 1727204268.62202: variable 'ansible_search_path' from source: unknown 28285 1727204268.62210: variable 'ansible_search_path' from source: unknown 28285 1727204268.62253: calling self._execute() 28285 1727204268.62341: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204268.62355: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204268.62370: variable 'omit' from source: magic vars 28285 1727204268.62822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204268.65712: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204268.65793: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204268.65834: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204268.65878: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204268.65913: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204268.66003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204268.66038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204268.66073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204268.66126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204268.66151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204268.66305: variable 'ansible_distribution' from source: facts 28285 1727204268.66324: variable 'ansible_distribution_major_version' from source: facts 28285 1727204268.66355: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204268.66363: when evaluation is False, skipping this task 28285 1727204268.66370: _execute() done 28285 1727204268.66375: dumping result to json 28285 1727204268.66381: done dumping result, returning 28285 1727204268.66390: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-57a1-d976-0000000000ca] 28285 1727204268.66399: sending task result for task 0affcd87-79f5-57a1-d976-0000000000ca skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204268.66546: no more pending results, returning what we have 28285 1727204268.66552: results queue empty 28285 1727204268.66553: checking for any_errors_fatal 28285 1727204268.66559: done checking for any_errors_fatal 28285 1727204268.66559: checking for max_fail_percentage 28285 1727204268.66561: done checking for max_fail_percentage 28285 1727204268.66562: checking to see if all hosts have failed and the running result is not ok 28285 1727204268.66562: done checking to see if all hosts have failed 28285 1727204268.66568: getting the remaining hosts for this loop 28285 1727204268.66570: done getting the remaining hosts for this loop 28285 1727204268.66574: getting the next task for host managed-node1 28285 1727204268.66582: done getting next task for host managed-node1 28285 1727204268.66586: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 28285 1727204268.66590: ^ state is: HOST STATE: block=3, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204268.66611: getting variables 28285 1727204268.66613: in VariableManager get_vars() 28285 1727204268.66681: Calling all_inventory to load vars for managed-node1 28285 1727204268.66684: Calling groups_inventory to load vars for managed-node1 28285 1727204268.66687: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204268.66699: Calling all_plugins_play to load vars for managed-node1 28285 1727204268.66701: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204268.66704: Calling groups_plugins_play to load vars for managed-node1 28285 1727204268.66956: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204268.67190: done with get_vars() 28285 1727204268.67279: done getting variables 28285 1727204268.67429: done sending task result for task 0affcd87-79f5-57a1-d976-0000000000ca 28285 1727204268.67433: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:57:48 -0400 (0:00:00.062) 0:00:09.485 ***** 28285 1727204268.67508: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_state 28285 1727204268.67951: worker is 1 (out of 1 available) 28285 1727204268.67969: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_state 28285 1727204268.67983: done queuing things up, now waiting for results queue to drain 28285 1727204268.67985: waiting for pending results... 
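
"Configure networking state" is the nmstate-based path of the role: it is driven by the network_state variable, which is also what the earlier "Install python3-libnmstate when using network_state variable" task keys on. A sketch of how that variable is typically supplied follows; the interface description uses the general nmstate schema and is illustrative only, not taken from this run.

- hosts: managed-node1
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_state:
          interfaces:
            - name: eth0            # illustrative interface only
              type: ethernet
              state: up
              ipv4:
                enabled: true
                dhcp: true
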
28285 1727204268.68266: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state 28285 1727204268.68413: in run() - task 0affcd87-79f5-57a1-d976-0000000000cb 28285 1727204268.68437: variable 'ansible_search_path' from source: unknown 28285 1727204268.68444: variable 'ansible_search_path' from source: unknown 28285 1727204268.68488: calling self._execute() 28285 1727204268.68577: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204268.68587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204268.68598: variable 'omit' from source: magic vars 28285 1727204268.69059: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204268.71786: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204268.71860: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204268.71912: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204268.71968: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204268.72005: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204268.72090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204268.72132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204268.72169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204268.72219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204268.72243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204268.72407: variable 'ansible_distribution' from source: facts 28285 1727204268.72419: variable 'ansible_distribution_major_version' from source: facts 28285 1727204268.72456: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204268.72465: when evaluation is False, skipping this task 28285 1727204268.72473: _execute() done 28285 1727204268.72478: dumping result to json 28285 1727204268.72485: done dumping result, returning 28285 1727204268.72495: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-57a1-d976-0000000000cb] 28285 1727204268.72505: sending task result for task 0affcd87-79f5-57a1-d976-0000000000cb skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", 
"skip_reason": "Conditional result was False" } 28285 1727204268.72670: no more pending results, returning what we have 28285 1727204268.72674: results queue empty 28285 1727204268.72676: checking for any_errors_fatal 28285 1727204268.72683: done checking for any_errors_fatal 28285 1727204268.72684: checking for max_fail_percentage 28285 1727204268.72686: done checking for max_fail_percentage 28285 1727204268.72687: checking to see if all hosts have failed and the running result is not ok 28285 1727204268.72688: done checking to see if all hosts have failed 28285 1727204268.72689: getting the remaining hosts for this loop 28285 1727204268.72691: done getting the remaining hosts for this loop 28285 1727204268.72696: getting the next task for host managed-node1 28285 1727204268.72703: done getting next task for host managed-node1 28285 1727204268.72708: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28285 1727204268.72711: ^ state is: HOST STATE: block=3, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204268.72731: getting variables 28285 1727204268.72733: in VariableManager get_vars() 28285 1727204268.72798: Calling all_inventory to load vars for managed-node1 28285 1727204268.72801: Calling groups_inventory to load vars for managed-node1 28285 1727204268.72804: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204268.72817: Calling all_plugins_play to load vars for managed-node1 28285 1727204268.72820: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204268.72824: Calling groups_plugins_play to load vars for managed-node1 28285 1727204268.73015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204268.73256: done with get_vars() 28285 1727204268.73272: done getting variables 28285 1727204268.73410: done sending task result for task 0affcd87-79f5-57a1-d976-0000000000cb 28285 1727204268.73413: WORKER PROCESS EXITING 28285 1727204268.73457: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:57:48 -0400 (0:00:00.059) 0:00:09.545 ***** 28285 1727204268.73502: entering _queue_task() for managed-node1/debug 28285 1727204268.74031: worker is 1 (out of 1 available) 28285 1727204268.74042: exiting _queue_task() for managed-node1/debug 28285 1727204268.74062: done queuing things up, now waiting for results queue to drain 28285 1727204268.74064: waiting for pending results... 
28285 1727204268.74337: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28285 1727204268.74475: in run() - task 0affcd87-79f5-57a1-d976-0000000000cc 28285 1727204268.74498: variable 'ansible_search_path' from source: unknown 28285 1727204268.74509: variable 'ansible_search_path' from source: unknown 28285 1727204268.74551: calling self._execute() 28285 1727204268.74643: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204268.74656: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204268.74672: variable 'omit' from source: magic vars 28285 1727204268.75131: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204268.79033: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204268.79251: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204268.79388: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204268.79432: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204268.79468: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204268.79709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204268.79750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204268.79785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204268.79937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204268.79960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204268.80226: variable 'ansible_distribution' from source: facts 28285 1727204268.80374: variable 'ansible_distribution_major_version' from source: facts 28285 1727204268.80402: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204268.80410: when evaluation is False, skipping this task 28285 1727204268.80417: _execute() done 28285 1727204268.80422: dumping result to json 28285 1727204268.80431: done dumping result, returning 28285 1727204268.80443: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-57a1-d976-0000000000cc] 28285 1727204268.80456: sending task result for task 0affcd87-79f5-57a1-d976-0000000000cc skipping: [managed-node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)" } 28285 1727204268.80657: no more pending results, returning what we have 28285 1727204268.80661: results queue empty 28285 1727204268.80662: checking for any_errors_fatal 28285 1727204268.80670: done checking for any_errors_fatal 28285 1727204268.80671: checking for max_fail_percentage 28285 1727204268.80673: done checking for max_fail_percentage 28285 1727204268.80674: checking to see if all hosts have failed and the running result is not ok 28285 1727204268.80675: done checking to see if all hosts have failed 28285 1727204268.80676: getting the remaining hosts for this loop 28285 1727204268.80678: done getting the remaining hosts for this loop 28285 1727204268.80682: getting the next task for host managed-node1 28285 1727204268.80690: done getting next task for host managed-node1 28285 1727204268.80694: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28285 1727204268.80697: ^ state is: HOST STATE: block=3, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204268.80719: getting variables 28285 1727204268.80721: in VariableManager get_vars() 28285 1727204268.80786: Calling all_inventory to load vars for managed-node1 28285 1727204268.80789: Calling groups_inventory to load vars for managed-node1 28285 1727204268.80792: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204268.80803: Calling all_plugins_play to load vars for managed-node1 28285 1727204268.80806: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204268.80809: Calling groups_plugins_play to load vars for managed-node1 28285 1727204268.80999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204268.81421: done with get_vars() 28285 1727204268.81433: done getting variables 28285 1727204268.81472: done sending task result for task 0affcd87-79f5-57a1-d976-0000000000cc 28285 1727204268.81475: WORKER PROCESS EXITING 28285 1727204268.81630: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:57:48 -0400 (0:00:00.081) 0:00:09.626 ***** 28285 1727204268.81670: entering _queue_task() for managed-node1/debug 28285 1727204268.82017: worker is 1 (out of 1 available) 28285 1727204268.82031: exiting _queue_task() for managed-node1/debug 28285 1727204268.82043: done queuing things up, now waiting for results queue to drain 28285 1727204268.82045: waiting for pending results... 
28285 1727204268.83068: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28285 1727204268.83315: in run() - task 0affcd87-79f5-57a1-d976-0000000000cd 28285 1727204268.83335: variable 'ansible_search_path' from source: unknown 28285 1727204268.83344: variable 'ansible_search_path' from source: unknown 28285 1727204268.83394: calling self._execute() 28285 1727204268.83483: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204268.83497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204268.83511: variable 'omit' from source: magic vars 28285 1727204268.83931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204268.86023: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204268.86105: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204268.86168: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204268.86212: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204268.86251: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204268.86342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204268.86383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204268.86419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204268.86475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204268.86494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204268.86668: variable 'ansible_distribution' from source: facts 28285 1727204268.86676: variable 'ansible_distribution_major_version' from source: facts 28285 1727204268.86692: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204268.86694: when evaluation is False, skipping this task 28285 1727204268.86697: _execute() done 28285 1727204268.86700: dumping result to json 28285 1727204268.86702: done dumping result, returning 28285 1727204268.86710: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-57a1-d976-0000000000cd] 28285 1727204268.86715: sending task result for task 0affcd87-79f5-57a1-d976-0000000000cd 28285 1727204268.86807: done sending task result for task 0affcd87-79f5-57a1-d976-0000000000cd 28285 1727204268.86810: WORKER 
PROCESS EXITING skipping: [managed-node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 28285 1727204268.86858: no more pending results, returning what we have 28285 1727204268.86862: results queue empty 28285 1727204268.86862: checking for any_errors_fatal 28285 1727204268.86870: done checking for any_errors_fatal 28285 1727204268.86870: checking for max_fail_percentage 28285 1727204268.86872: done checking for max_fail_percentage 28285 1727204268.86873: checking to see if all hosts have failed and the running result is not ok 28285 1727204268.86874: done checking to see if all hosts have failed 28285 1727204268.86874: getting the remaining hosts for this loop 28285 1727204268.86876: done getting the remaining hosts for this loop 28285 1727204268.86880: getting the next task for host managed-node1 28285 1727204268.86886: done getting next task for host managed-node1 28285 1727204268.86890: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28285 1727204268.86893: ^ state is: HOST STATE: block=3, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204268.86911: getting variables 28285 1727204268.86913: in VariableManager get_vars() 28285 1727204268.86972: Calling all_inventory to load vars for managed-node1 28285 1727204268.86975: Calling groups_inventory to load vars for managed-node1 28285 1727204268.86977: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204268.86987: Calling all_plugins_play to load vars for managed-node1 28285 1727204268.86989: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204268.86991: Calling groups_plugins_play to load vars for managed-node1 28285 1727204268.87126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204268.87256: done with get_vars() 28285 1727204268.87266: done getting variables 28285 1727204268.87310: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:57:48 -0400 (0:00:00.056) 0:00:09.683 ***** 28285 1727204268.87333: entering _queue_task() for managed-node1/debug 28285 1727204268.87530: worker is 1 (out of 1 available) 28285 1727204268.87544: exiting _queue_task() for managed-node1/debug 28285 1727204268.87559: done queuing things up, now waiting for results queue to drain 28285 1727204268.87561: waiting for pending results... 
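Every task in this stretch, role tasks and test tasks alike, reports the identical false_condition, which is consistent with the guard being written once on an enclosing block (or on the include of this test section) and inherited by each task; whether the test playbook is actually structured this way is an assumption. A sketch of that placement, with placeholder task bodies:

- block:
    - name: Show debug messages for the network_connections
      debug:
        msg: placeholder body only
    - name: Show debug messages for the network_state
      debug:
        msg: placeholder body only
  when: >-
    ansible_distribution in ['CentOS','RedHat'] and
    ansible_distribution_major_version | int < 9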
28285 1727204268.87722: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28285 1727204268.87810: in run() - task 0affcd87-79f5-57a1-d976-0000000000ce 28285 1727204268.87821: variable 'ansible_search_path' from source: unknown 28285 1727204268.87825: variable 'ansible_search_path' from source: unknown 28285 1727204268.87856: calling self._execute() 28285 1727204268.87915: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204268.87920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204268.87928: variable 'omit' from source: magic vars 28285 1727204268.88223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204268.90696: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204268.90765: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204268.90780: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204268.90808: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204268.90830: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204268.90901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204268.90921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204268.90938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204268.90969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204268.90980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204268.91086: variable 'ansible_distribution' from source: facts 28285 1727204268.91097: variable 'ansible_distribution_major_version' from source: facts 28285 1727204268.91113: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204268.91116: when evaluation is False, skipping this task 28285 1727204268.91119: _execute() done 28285 1727204268.91121: dumping result to json 28285 1727204268.91123: done dumping result, returning 28285 1727204268.91131: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-57a1-d976-0000000000ce] 28285 1727204268.91136: sending task result for task 0affcd87-79f5-57a1-d976-0000000000ce 28285 1727204268.91233: done sending task result for task 0affcd87-79f5-57a1-d976-0000000000ce 28285 1727204268.91236: WORKER PROCESS EXITING 
skipping: [managed-node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 28285 1727204268.91281: no more pending results, returning what we have 28285 1727204268.91285: results queue empty 28285 1727204268.91286: checking for any_errors_fatal 28285 1727204268.91292: done checking for any_errors_fatal 28285 1727204268.91292: checking for max_fail_percentage 28285 1727204268.91294: done checking for max_fail_percentage 28285 1727204268.91295: checking to see if all hosts have failed and the running result is not ok 28285 1727204268.91296: done checking to see if all hosts have failed 28285 1727204268.91297: getting the remaining hosts for this loop 28285 1727204268.91298: done getting the remaining hosts for this loop 28285 1727204268.91302: getting the next task for host managed-node1 28285 1727204268.91309: done getting next task for host managed-node1 28285 1727204268.91313: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 28285 1727204268.91316: ^ state is: HOST STATE: block=3, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204268.91333: getting variables 28285 1727204268.91335: in VariableManager get_vars() 28285 1727204268.91389: Calling all_inventory to load vars for managed-node1 28285 1727204268.91392: Calling groups_inventory to load vars for managed-node1 28285 1727204268.91394: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204268.91403: Calling all_plugins_play to load vars for managed-node1 28285 1727204268.91405: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204268.91408: Calling groups_plugins_play to load vars for managed-node1 28285 1727204268.91729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204268.92076: done with get_vars() 28285 1727204268.92083: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:57:48 -0400 (0:00:00.048) 0:00:09.731 ***** 28285 1727204268.92155: entering _queue_task() for managed-node1/ping 28285 1727204268.92373: worker is 1 (out of 1 available) 28285 1727204268.92385: exiting _queue_task() for managed-node1/ping 28285 1727204268.92399: done queuing things up, now waiting for results queue to drain 28285 1727204268.92401: waiting for pending results... 
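The _queue_task() entry above shows the Re-test connectivity task resolving to the ping module; in this run it is skipped by the same guard. A minimal sketch:

- name: Re-test connectivity
  ping: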
28285 1727204268.92676: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 28285 1727204268.92821: in run() - task 0affcd87-79f5-57a1-d976-0000000000cf 28285 1727204268.92844: variable 'ansible_search_path' from source: unknown 28285 1727204268.92851: variable 'ansible_search_path' from source: unknown 28285 1727204268.92894: calling self._execute() 28285 1727204268.92980: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204268.92990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204268.93006: variable 'omit' from source: magic vars 28285 1727204268.93435: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204268.96543: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204268.96647: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204268.96693: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204268.96735: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204268.96770: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204268.96849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204268.96886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204268.96917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204268.96956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204268.96979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204268.97123: variable 'ansible_distribution' from source: facts 28285 1727204268.97137: variable 'ansible_distribution_major_version' from source: facts 28285 1727204268.97161: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204268.97171: when evaluation is False, skipping this task 28285 1727204268.97179: _execute() done 28285 1727204268.97185: dumping result to json 28285 1727204268.97193: done dumping result, returning 28285 1727204268.97204: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-57a1-d976-0000000000cf] 28285 1727204268.97215: sending task result for task 0affcd87-79f5-57a1-d976-0000000000cf 28285 1727204268.97322: done sending task result for task 0affcd87-79f5-57a1-d976-0000000000cf 28285 1727204268.97331: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": 
false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204268.97398: no more pending results, returning what we have 28285 1727204268.97401: results queue empty 28285 1727204268.97402: checking for any_errors_fatal 28285 1727204268.97409: done checking for any_errors_fatal 28285 1727204268.97409: checking for max_fail_percentage 28285 1727204268.97411: done checking for max_fail_percentage 28285 1727204268.97412: checking to see if all hosts have failed and the running result is not ok 28285 1727204268.97413: done checking to see if all hosts have failed 28285 1727204268.97413: getting the remaining hosts for this loop 28285 1727204268.97415: done getting the remaining hosts for this loop 28285 1727204268.97418: getting the next task for host managed-node1 28285 1727204268.97428: done getting next task for host managed-node1 28285 1727204268.97431: ^ task is: TASK: meta (role_complete) 28285 1727204268.97434: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204268.97454: getting variables 28285 1727204268.97456: in VariableManager get_vars() 28285 1727204268.97512: Calling all_inventory to load vars for managed-node1 28285 1727204268.97515: Calling groups_inventory to load vars for managed-node1 28285 1727204268.97518: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204268.97528: Calling all_plugins_play to load vars for managed-node1 28285 1727204268.97530: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204268.97532: Calling groups_plugins_play to load vars for managed-node1 28285 1727204268.97718: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204268.97963: done with get_vars() 28285 1727204268.97978: done getting variables 28285 1727204268.98187: done queuing things up, now waiting for results queue to drain 28285 1727204268.98190: results queue empty 28285 1727204268.98191: checking for any_errors_fatal 28285 1727204268.98193: done checking for any_errors_fatal 28285 1727204268.98194: checking for max_fail_percentage 28285 1727204268.98195: done checking for max_fail_percentage 28285 1727204268.98195: checking to see if all hosts have failed and the running result is not ok 28285 1727204268.98196: done checking to see if all hosts have failed 28285 1727204268.98196: getting the remaining hosts for this loop 28285 1727204268.98197: done getting the remaining hosts for this loop 28285 1727204268.98199: getting the next task for host managed-node1 28285 1727204268.98202: done getting next task for host managed-node1 28285 1727204268.98203: ^ task is: TASK: Get current device features 28285 1727204268.98205: ^ state is: HOST STATE: block=3, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204268.98206: getting variables 28285 1727204268.98207: in VariableManager get_vars() 28285 1727204268.98224: Calling all_inventory to load vars for managed-node1 28285 1727204268.98226: Calling groups_inventory to load vars for managed-node1 28285 1727204268.98227: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204268.98231: Calling all_plugins_play to load vars for managed-node1 28285 1727204268.98232: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204268.98234: Calling groups_plugins_play to load vars for managed-node1 28285 1727204268.98392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204268.98641: done with get_vars() 28285 1727204268.98659: done getting variables 28285 1727204268.98712: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get current device features] ********************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethtool_features.yml:120 Tuesday 24 September 2024 14:57:48 -0400 (0:00:00.065) 0:00:09.797 ***** 28285 1727204268.98740: entering _queue_task() for managed-node1/command 28285 1727204268.99079: worker is 1 (out of 1 available) 28285 1727204268.99091: exiting _queue_task() for managed-node1/command 28285 1727204268.99105: done queuing things up, now waiting for results queue to drain 28285 1727204268.99107: waiting for pending results... 
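The test task queued here resolves to the command action plugin; given the playbook name (tests_ethtool_features.yml) it presumably captures the device's current offload features, most likely with ethtool. A sketch under that assumption; the exact command line, the interface variable, and the register name are all guesses, not taken from this log:

- name: Get current device features
  command: ethtool --show-features {{ interface }}   # assumed command and interface variable
  register: original_dev_features                    # assumed register name
  changed_when: false                                # reading state should not report a change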
28285 1727204268.99426: running TaskExecutor() for managed-node1/TASK: Get current device features 28285 1727204268.99544: in run() - task 0affcd87-79f5-57a1-d976-0000000000ff 28285 1727204268.99559: variable 'ansible_search_path' from source: unknown 28285 1727204268.99594: calling self._execute() 28285 1727204268.99670: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204268.99678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204268.99686: variable 'omit' from source: magic vars 28285 1727204269.00042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204269.02533: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204269.02625: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204269.02673: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204269.02714: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204269.02744: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204269.02829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204269.02869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204269.02901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204269.02945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204269.02968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204269.03119: variable 'ansible_distribution' from source: facts 28285 1727204269.03131: variable 'ansible_distribution_major_version' from source: facts 28285 1727204269.03156: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204269.03165: when evaluation is False, skipping this task 28285 1727204269.03173: _execute() done 28285 1727204269.03179: dumping result to json 28285 1727204269.03185: done dumping result, returning 28285 1727204269.03196: done running TaskExecutor() for managed-node1/TASK: Get current device features [0affcd87-79f5-57a1-d976-0000000000ff] 28285 1727204269.03207: sending task result for task 0affcd87-79f5-57a1-d976-0000000000ff skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204269.03356: no more pending results, returning what we have 28285 1727204269.03360: results 
queue empty 28285 1727204269.03361: checking for any_errors_fatal 28285 1727204269.03362: done checking for any_errors_fatal 28285 1727204269.03363: checking for max_fail_percentage 28285 1727204269.03366: done checking for max_fail_percentage 28285 1727204269.03367: checking to see if all hosts have failed and the running result is not ok 28285 1727204269.03368: done checking to see if all hosts have failed 28285 1727204269.03369: getting the remaining hosts for this loop 28285 1727204269.03371: done getting the remaining hosts for this loop 28285 1727204269.03375: getting the next task for host managed-node1 28285 1727204269.03380: done getting next task for host managed-node1 28285 1727204269.03383: ^ task is: TASK: Show ethtool_features 28285 1727204269.03385: ^ state is: HOST STATE: block=3, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204269.03387: getting variables 28285 1727204269.03389: in VariableManager get_vars() 28285 1727204269.03442: Calling all_inventory to load vars for managed-node1 28285 1727204269.03445: Calling groups_inventory to load vars for managed-node1 28285 1727204269.03447: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204269.03458: Calling all_plugins_play to load vars for managed-node1 28285 1727204269.03461: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204269.03465: Calling groups_plugins_play to load vars for managed-node1 28285 1727204269.03623: done sending task result for task 0affcd87-79f5-57a1-d976-0000000000ff 28285 1727204269.03627: WORKER PROCESS EXITING 28285 1727204269.03650: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204269.03878: done with get_vars() 28285 1727204269.03890: done getting variables 28285 1727204269.03954: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show ethtool_features] *************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethtool_features.yml:124 Tuesday 24 September 2024 14:57:49 -0400 (0:00:00.052) 0:00:09.849 ***** 28285 1727204269.03983: entering _queue_task() for managed-node1/debug 28285 1727204269.04278: worker is 1 (out of 1 available) 28285 1727204269.04290: exiting _queue_task() for managed-node1/debug 28285 1727204269.04303: done queuing things up, now waiting for results queue to drain 28285 1727204269.04304: waiting for pending results... 
28285 1727204269.04579: running TaskExecutor() for managed-node1/TASK: Show ethtool_features 28285 1727204269.04689: in run() - task 0affcd87-79f5-57a1-d976-000000000100 28285 1727204269.04717: variable 'ansible_search_path' from source: unknown 28285 1727204269.04762: calling self._execute() 28285 1727204269.04871: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204269.04882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204269.04895: variable 'omit' from source: magic vars 28285 1727204269.05351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204269.08000: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204269.08075: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204269.08475: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204269.08479: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204269.08481: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204269.08484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204269.08487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204269.08489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204269.08491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204269.08493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204269.08515: variable 'ansible_distribution' from source: facts 28285 1727204269.08522: variable 'ansible_distribution_major_version' from source: facts 28285 1727204269.08539: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204269.08543: when evaluation is False, skipping this task 28285 1727204269.08545: _execute() done 28285 1727204269.08548: dumping result to json 28285 1727204269.08554: done dumping result, returning 28285 1727204269.08562: done running TaskExecutor() for managed-node1/TASK: Show ethtool_features [0affcd87-79f5-57a1-d976-000000000100] 28285 1727204269.08569: sending task result for task 0affcd87-79f5-57a1-d976-000000000100 28285 1727204269.08656: done sending task result for task 0affcd87-79f5-57a1-d976-000000000100 28285 1727204269.08659: WORKER PROCESS EXITING skipping: [managed-node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 28285 1727204269.08725: no more pending 
results, returning what we have 28285 1727204269.08728: results queue empty 28285 1727204269.08729: checking for any_errors_fatal 28285 1727204269.08733: done checking for any_errors_fatal 28285 1727204269.08734: checking for max_fail_percentage 28285 1727204269.08736: done checking for max_fail_percentage 28285 1727204269.08737: checking to see if all hosts have failed and the running result is not ok 28285 1727204269.08737: done checking to see if all hosts have failed 28285 1727204269.08738: getting the remaining hosts for this loop 28285 1727204269.08740: done getting the remaining hosts for this loop 28285 1727204269.08743: getting the next task for host managed-node1 28285 1727204269.08749: done getting next task for host managed-node1 28285 1727204269.08752: ^ task is: TASK: Assert device features 28285 1727204269.08754: ^ state is: HOST STATE: block=3, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204269.08757: getting variables 28285 1727204269.08758: in VariableManager get_vars() 28285 1727204269.08812: Calling all_inventory to load vars for managed-node1 28285 1727204269.08814: Calling groups_inventory to load vars for managed-node1 28285 1727204269.08817: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204269.08826: Calling all_plugins_play to load vars for managed-node1 28285 1727204269.08829: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204269.08831: Calling groups_plugins_play to load vars for managed-node1 28285 1727204269.09014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204269.09275: done with get_vars() 28285 1727204269.09285: done getting variables 28285 1727204269.09349: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert device features] ************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethtool_features.yml:127 Tuesday 24 September 2024 14:57:49 -0400 (0:00:00.053) 0:00:09.903 ***** 28285 1727204269.09381: entering _queue_task() for managed-node1/assert 28285 1727204269.09654: worker is 1 (out of 1 available) 28285 1727204269.09668: exiting _queue_task() for managed-node1/assert 28285 1727204269.09680: done queuing things up, now waiting for results queue to drain 28285 1727204269.09682: waiting for pending results... 
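The Get current device features / Show ethtool_features / Assert device features sequence suggests a read, show, verify pattern ending in the assert action loaded above. A hedged sketch of the final step; the compared variables and the message are assumptions:

- name: Assert device features
  assert:
    that:
      - original_dev_features.stdout == current_dev_features.stdout   # assumed variable names
    fail_msg: device features do not match the expected values        # assumed message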
28285 1727204269.09959: running TaskExecutor() for managed-node1/TASK: Assert device features 28285 1727204269.10057: in run() - task 0affcd87-79f5-57a1-d976-000000000101 28285 1727204269.10082: variable 'ansible_search_path' from source: unknown 28285 1727204269.10127: calling self._execute() 28285 1727204269.10223: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204269.10238: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204269.10252: variable 'omit' from source: magic vars 28285 1727204269.10710: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204269.13147: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204269.13243: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204269.13293: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204269.13335: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204269.13378: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204269.13462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204269.13503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204269.13536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204269.13596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204269.13616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204269.13769: variable 'ansible_distribution' from source: facts 28285 1727204269.13841: variable 'ansible_distribution_major_version' from source: facts 28285 1727204269.13869: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204269.13878: when evaluation is False, skipping this task 28285 1727204269.13886: _execute() done 28285 1727204269.13897: dumping result to json 28285 1727204269.13904: done dumping result, returning 28285 1727204269.13915: done running TaskExecutor() for managed-node1/TASK: Assert device features [0affcd87-79f5-57a1-d976-000000000101] 28285 1727204269.13926: sending task result for task 0affcd87-79f5-57a1-d976-000000000101 28285 1727204269.14043: done sending task result for task 0affcd87-79f5-57a1-d976-000000000101 skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 
1727204269.14093: no more pending results, returning what we have 28285 1727204269.14097: results queue empty 28285 1727204269.14098: checking for any_errors_fatal 28285 1727204269.14105: done checking for any_errors_fatal 28285 1727204269.14106: checking for max_fail_percentage 28285 1727204269.14108: done checking for max_fail_percentage 28285 1727204269.14109: checking to see if all hosts have failed and the running result is not ok 28285 1727204269.14110: done checking to see if all hosts have failed 28285 1727204269.14111: getting the remaining hosts for this loop 28285 1727204269.14113: done getting the remaining hosts for this loop 28285 1727204269.14117: getting the next task for host managed-node1 28285 1727204269.14125: done getting next task for host managed-node1 28285 1727204269.14128: ^ task is: TASK: TEST: Change feature with both underscores and dashes. 28285 1727204269.14132: ^ state is: HOST STATE: block=3, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204269.14136: getting variables 28285 1727204269.14140: in VariableManager get_vars() 28285 1727204269.14197: Calling all_inventory to load vars for managed-node1 28285 1727204269.14200: Calling groups_inventory to load vars for managed-node1 28285 1727204269.14203: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204269.14214: Calling all_plugins_play to load vars for managed-node1 28285 1727204269.14217: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204269.14220: Calling groups_plugins_play to load vars for managed-node1 28285 1727204269.14411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204269.14626: done with get_vars() 28285 1727204269.14638: done getting variables 28285 1727204269.14679: WORKER PROCESS EXITING 28285 1727204269.14746: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [TEST: Change feature with both underscores and dashes.] ****************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethtool_features.yml:136 Tuesday 24 September 2024 14:57:49 -0400 (0:00:00.053) 0:00:09.957 ***** 28285 1727204269.14783: entering _queue_task() for managed-node1/debug 28285 1727204269.16646: worker is 1 (out of 1 available) 28285 1727204269.16658: exiting _queue_task() for managed-node1/debug 28285 1727204269.16674: done queuing things up, now waiting for results queue to drain 28285 1727204269.16676: waiting for pending results... 28285 1727204269.16831: running TaskExecutor() for managed-node1/TASK: TEST: Change feature with both underscores and dashes. 
28285 1727204269.16923: in run() - task 0affcd87-79f5-57a1-d976-000000000103 28285 1727204269.16935: variable 'ansible_search_path' from source: unknown 28285 1727204269.16976: calling self._execute() 28285 1727204269.17065: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204269.17071: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204269.17081: variable 'omit' from source: magic vars 28285 1727204269.17519: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204269.20027: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204269.20092: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204269.20128: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204269.20163: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204269.20190: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204269.20267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204269.20294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204269.20317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204269.20361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204269.20375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204269.20517: variable 'ansible_distribution' from source: facts 28285 1727204269.20522: variable 'ansible_distribution_major_version' from source: facts 28285 1727204269.20541: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204269.20545: when evaluation is False, skipping this task 28285 1727204269.20547: _execute() done 28285 1727204269.20550: dumping result to json 28285 1727204269.20555: done dumping result, returning 28285 1727204269.20565: done running TaskExecutor() for managed-node1/TASK: TEST: Change feature with both underscores and dashes. 
[0affcd87-79f5-57a1-d976-000000000103] 28285 1727204269.20572: sending task result for task 0affcd87-79f5-57a1-d976-000000000103 28285 1727204269.20658: done sending task result for task 0affcd87-79f5-57a1-d976-000000000103 28285 1727204269.20661: WORKER PROCESS EXITING skipping: [managed-node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 28285 1727204269.20735: no more pending results, returning what we have 28285 1727204269.20738: results queue empty 28285 1727204269.20739: checking for any_errors_fatal 28285 1727204269.20745: done checking for any_errors_fatal 28285 1727204269.20746: checking for max_fail_percentage 28285 1727204269.20748: done checking for max_fail_percentage 28285 1727204269.20749: checking to see if all hosts have failed and the running result is not ok 28285 1727204269.20750: done checking to see if all hosts have failed 28285 1727204269.20750: getting the remaining hosts for this loop 28285 1727204269.20752: done getting the remaining hosts for this loop 28285 1727204269.20756: getting the next task for host managed-node1 28285 1727204269.20762: done getting next task for host managed-node1 28285 1727204269.20767: ^ task is: TASK: Configure ethtool features setting 28285 1727204269.20770: ^ state is: HOST STATE: block=3, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204269.20774: getting variables 28285 1727204269.20776: in VariableManager get_vars() 28285 1727204269.20827: Calling all_inventory to load vars for managed-node1 28285 1727204269.20830: Calling groups_inventory to load vars for managed-node1 28285 1727204269.20832: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204269.20841: Calling all_plugins_play to load vars for managed-node1 28285 1727204269.20843: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204269.20846: Calling groups_plugins_play to load vars for managed-node1 28285 1727204269.21067: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204269.21270: done with get_vars() 28285 1727204269.21280: done getting variables TASK [Configure ethtool features setting] ************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethtool_features.yml:140 Tuesday 24 September 2024 14:57:49 -0400 (0:00:00.065) 0:00:10.023 ***** 28285 1727204269.21358: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 28285 1727204269.22329: worker is 1 (out of 1 available) 28285 1727204269.22340: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 28285 1727204269.22351: done queuing things up, now waiting for results queue to drain 28285 1727204269.22353: waiting for pending results... 
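The task queued above resolves to the collection's own fedora.linux_system_roles.network_connections plugin rather than a builtin, and the preceding TEST banner indicates an ethtool feature is being set with both an underscore and a dash spelling. Rather than guess that internal plugin's parameters, the sketch below shows how such a change is normally expressed through the role's documented network_connections variable; the connection name and feature names are assumptions, and the Check failure / Assert that the result is failure tasks that follow suggest this particular combination is expected to be rejected:

- name: Configure ethtool features setting (role-level sketch)
  include_role:
    name: fedora.linux_system_roles.network
  vars:
    network_connections:
      - name: testnic1                      # assumed connection name
        type: ethernet
        ethtool:
          features:
            tx_tcp_segmentation: false      # underscore spelling
            tx-tcp-segmentation: false      # dash spelling; feature names are assumptions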
28285 1727204269.23248: running TaskExecutor() for managed-node1/TASK: Configure ethtool features setting 28285 1727204269.23343: in run() - task 0affcd87-79f5-57a1-d976-000000000104 28285 1727204269.23360: variable 'ansible_search_path' from source: unknown 28285 1727204269.23400: calling self._execute() 28285 1727204269.23496: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204269.23500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204269.23507: variable 'omit' from source: magic vars 28285 1727204269.24760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204269.30103: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204269.30185: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204269.30224: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204269.30260: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204269.30289: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204269.30369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204269.30402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204269.30427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204269.30475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204269.30489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204269.30847: variable 'ansible_distribution' from source: facts 28285 1727204269.30856: variable 'ansible_distribution_major_version' from source: facts 28285 1727204269.31582: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204269.31585: when evaluation is False, skipping this task 28285 1727204269.31588: _execute() done 28285 1727204269.31591: dumping result to json 28285 1727204269.31593: done dumping result, returning 28285 1727204269.31600: done running TaskExecutor() for managed-node1/TASK: Configure ethtool features setting [0affcd87-79f5-57a1-d976-000000000104] 28285 1727204269.31607: sending task result for task 0affcd87-79f5-57a1-d976-000000000104 28285 1727204269.31719: done sending task result for task 0affcd87-79f5-57a1-d976-000000000104 28285 1727204269.31722: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 
9)", "skip_reason": "Conditional result was False" } 28285 1727204269.31770: no more pending results, returning what we have 28285 1727204269.31773: results queue empty 28285 1727204269.31774: checking for any_errors_fatal 28285 1727204269.31780: done checking for any_errors_fatal 28285 1727204269.31781: checking for max_fail_percentage 28285 1727204269.31783: done checking for max_fail_percentage 28285 1727204269.31784: checking to see if all hosts have failed and the running result is not ok 28285 1727204269.31784: done checking to see if all hosts have failed 28285 1727204269.31785: getting the remaining hosts for this loop 28285 1727204269.31787: done getting the remaining hosts for this loop 28285 1727204269.31791: getting the next task for host managed-node1 28285 1727204269.31798: done getting next task for host managed-node1 28285 1727204269.31800: ^ task is: TASK: Check failure 28285 1727204269.31803: ^ state is: HOST STATE: block=3, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204269.31806: getting variables 28285 1727204269.31808: in VariableManager get_vars() 28285 1727204269.31860: Calling all_inventory to load vars for managed-node1 28285 1727204269.31865: Calling groups_inventory to load vars for managed-node1 28285 1727204269.31867: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204269.31878: Calling all_plugins_play to load vars for managed-node1 28285 1727204269.31880: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204269.31883: Calling groups_plugins_play to load vars for managed-node1 28285 1727204269.32051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204269.32262: done with get_vars() 28285 1727204269.32276: done getting variables 28285 1727204269.32334: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check failure] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethtool_features.yml:170 Tuesday 24 September 2024 14:57:49 -0400 (0:00:00.110) 0:00:10.133 ***** 28285 1727204269.32363: entering _queue_task() for managed-node1/debug 28285 1727204269.33351: worker is 1 (out of 1 available) 28285 1727204269.33363: exiting _queue_task() for managed-node1/debug 28285 1727204269.33376: done queuing things up, now waiting for results queue to drain 28285 1727204269.33378: waiting for pending results... 
28285 1727204269.34019: running TaskExecutor() for managed-node1/TASK: Check failure 28285 1727204269.34769: in run() - task 0affcd87-79f5-57a1-d976-000000000107 28285 1727204269.34788: variable 'ansible_search_path' from source: unknown 28285 1727204269.34832: calling self._execute() 28285 1727204269.34929: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204269.34942: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204269.34959: variable 'omit' from source: magic vars 28285 1727204269.35574: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204269.40936: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204269.41015: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204269.41058: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204269.41102: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204269.41131: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204269.41215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204269.41252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204269.41499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204269.41544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204269.41569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204269.42019: variable 'ansible_distribution' from source: facts 28285 1727204269.42030: variable 'ansible_distribution_major_version' from source: facts 28285 1727204269.42055: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204269.42061: when evaluation is False, skipping this task 28285 1727204269.42068: _execute() done 28285 1727204269.42074: dumping result to json 28285 1727204269.42079: done dumping result, returning 28285 1727204269.42088: done running TaskExecutor() for managed-node1/TASK: Check failure [0affcd87-79f5-57a1-d976-000000000107] 28285 1727204269.42096: sending task result for task 0affcd87-79f5-57a1-d976-000000000107 skipping: [managed-node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 28285 1727204269.42229: no more pending results, returning what we have 28285 1727204269.42233: results queue empty 28285 1727204269.42233: checking for any_errors_fatal 28285 1727204269.42242: 
done checking for any_errors_fatal 28285 1727204269.42243: checking for max_fail_percentage 28285 1727204269.42245: done checking for max_fail_percentage 28285 1727204269.42246: checking to see if all hosts have failed and the running result is not ok 28285 1727204269.42246: done checking to see if all hosts have failed 28285 1727204269.42247: getting the remaining hosts for this loop 28285 1727204269.42249: done getting the remaining hosts for this loop 28285 1727204269.42253: getting the next task for host managed-node1 28285 1727204269.42260: done getting next task for host managed-node1 28285 1727204269.42262: ^ task is: TASK: Assert that the result is failure 28285 1727204269.42267: ^ state is: HOST STATE: block=3, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204269.42270: getting variables 28285 1727204269.42272: in VariableManager get_vars() 28285 1727204269.42323: Calling all_inventory to load vars for managed-node1 28285 1727204269.42325: Calling groups_inventory to load vars for managed-node1 28285 1727204269.42327: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204269.42338: Calling all_plugins_play to load vars for managed-node1 28285 1727204269.42340: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204269.42342: Calling groups_plugins_play to load vars for managed-node1 28285 1727204269.42562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204269.42766: done with get_vars() 28285 1727204269.42778: done getting variables 28285 1727204269.42915: done sending task result for task 0affcd87-79f5-57a1-d976-000000000107 28285 1727204269.42918: WORKER PROCESS EXITING 28285 1727204269.42955: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the result is failure] *************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethtool_features.yml:173 Tuesday 24 September 2024 14:57:49 -0400 (0:00:00.106) 0:00:10.239 ***** 28285 1727204269.42986: entering _queue_task() for managed-node1/assert 28285 1727204269.43251: worker is 1 (out of 1 available) 28285 1727204269.43266: exiting _queue_task() for managed-node1/assert 28285 1727204269.43278: done queuing things up, now waiting for results queue to drain 28285 1727204269.43280: waiting for pending results... 
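The "Check failure" task at tests_ethtool_features.yml:170 was just skipped, and the "Assert that the result is failure" task queued above (line 173 of the same playbook) is about to be skipped for the same reason: their shared when: expression evaluates to False. The sketch below shows how such a gate typically looks in a playbook; only the conditional is taken verbatim from the false_condition fields in this log, while the task bodies and the __ethtool_result variable are illustrative assumptions.

# Sketch only -- the task bodies are not taken from tests_ethtool_features.yml.
- name: Check failure
  ansible.builtin.debug:
    var: __ethtool_result          # hypothetical registered variable, for illustration
  when: >-
    ansible_distribution in ['CentOS','RedHat'] and
    ansible_distribution_major_version | int < 9

- name: Assert that the result is failure
  ansible.builtin.assert:
    that:
      - __ethtool_result is failed   # illustrative check, not from the playbook
  when: >-
    ansible_distribution in ['CentOS','RedHat'] and
    ansible_distribution_major_version | int < 9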
28285 1727204269.43559: running TaskExecutor() for managed-node1/TASK: Assert that the result is failure 28285 1727204269.43672: in run() - task 0affcd87-79f5-57a1-d976-000000000108 28285 1727204269.43691: variable 'ansible_search_path' from source: unknown 28285 1727204269.43735: calling self._execute() 28285 1727204269.43830: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204269.43848: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204269.43867: variable 'omit' from source: magic vars 28285 1727204269.44412: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204269.46948: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204269.47042: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204269.47094: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204269.47154: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204269.47188: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204269.47273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204269.47306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204269.47343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204269.47390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204269.47410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204269.47579: variable 'ansible_distribution' from source: facts 28285 1727204269.47590: variable 'ansible_distribution_major_version' from source: facts 28285 1727204269.47612: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204269.47619: when evaluation is False, skipping this task 28285 1727204269.47625: _execute() done 28285 1727204269.47632: dumping result to json 28285 1727204269.47639: done dumping result, returning 28285 1727204269.47655: done running TaskExecutor() for managed-node1/TASK: Assert that the result is failure [0affcd87-79f5-57a1-d976-000000000108] 28285 1727204269.47667: sending task result for task 0affcd87-79f5-57a1-d976-000000000108 skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204269.47802: no more pending results, returning what we have 28285 
1727204269.47806: results queue empty 28285 1727204269.47807: checking for any_errors_fatal 28285 1727204269.47812: done checking for any_errors_fatal 28285 1727204269.47813: checking for max_fail_percentage 28285 1727204269.47815: done checking for max_fail_percentage 28285 1727204269.47816: checking to see if all hosts have failed and the running result is not ok 28285 1727204269.47817: done checking to see if all hosts have failed 28285 1727204269.47818: getting the remaining hosts for this loop 28285 1727204269.47820: done getting the remaining hosts for this loop 28285 1727204269.47823: getting the next task for host managed-node1 28285 1727204269.47831: done getting next task for host managed-node1 28285 1727204269.47833: ^ task is: TASK: TEST: I can reset features to their original value. 28285 1727204269.47836: ^ state is: HOST STATE: block=3, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204269.47840: getting variables 28285 1727204269.47842: in VariableManager get_vars() 28285 1727204269.47897: Calling all_inventory to load vars for managed-node1 28285 1727204269.47900: Calling groups_inventory to load vars for managed-node1 28285 1727204269.47902: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204269.47913: Calling all_plugins_play to load vars for managed-node1 28285 1727204269.47916: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204269.47919: Calling groups_plugins_play to load vars for managed-node1 28285 1727204269.48096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204269.48322: done with get_vars() 28285 1727204269.48345: done getting variables 28285 1727204269.48414: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [TEST: I can reset features to their original value.] ********************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethtool_features.yml:177 Tuesday 24 September 2024 14:57:49 -0400 (0:00:00.054) 0:00:10.294 ***** 28285 1727204269.48449: entering _queue_task() for managed-node1/debug 28285 1727204269.48470: done sending task result for task 0affcd87-79f5-57a1-d976-000000000108 28285 1727204269.48490: WORKER PROCESS EXITING 28285 1727204269.49023: worker is 1 (out of 1 available) 28285 1727204269.49034: exiting _queue_task() for managed-node1/debug 28285 1727204269.49049: done queuing things up, now waiting for results queue to drain 28285 1727204269.49051: waiting for pending results... 28285 1727204269.49318: running TaskExecutor() for managed-node1/TASK: TEST: I can reset features to their original value. 
28285 1727204269.49413: in run() - task 0affcd87-79f5-57a1-d976-000000000109 28285 1727204269.49430: variable 'ansible_search_path' from source: unknown 28285 1727204269.49471: calling self._execute() 28285 1727204269.49571: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204269.49584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204269.49606: variable 'omit' from source: magic vars 28285 1727204269.50046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204269.55271: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204269.55342: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204269.55385: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204269.55424: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204269.55454: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204269.55534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204269.55572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204269.55602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204269.55647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204269.55668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204269.55812: variable 'ansible_distribution' from source: facts 28285 1727204269.55823: variable 'ansible_distribution_major_version' from source: facts 28285 1727204269.55845: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204269.55853: when evaluation is False, skipping this task 28285 1727204269.55859: _execute() done 28285 1727204269.55867: dumping result to json 28285 1727204269.55875: done dumping result, returning 28285 1727204269.55885: done running TaskExecutor() for managed-node1/TASK: TEST: I can reset features to their original value. 
[0affcd87-79f5-57a1-d976-000000000109] 28285 1727204269.55895: sending task result for task 0affcd87-79f5-57a1-d976-000000000109 28285 1727204269.55995: done sending task result for task 0affcd87-79f5-57a1-d976-000000000109 28285 1727204269.56002: WORKER PROCESS EXITING skipping: [managed-node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 28285 1727204269.56063: no more pending results, returning what we have 28285 1727204269.56068: results queue empty 28285 1727204269.56069: checking for any_errors_fatal 28285 1727204269.56074: done checking for any_errors_fatal 28285 1727204269.56074: checking for max_fail_percentage 28285 1727204269.56076: done checking for max_fail_percentage 28285 1727204269.56077: checking to see if all hosts have failed and the running result is not ok 28285 1727204269.56078: done checking to see if all hosts have failed 28285 1727204269.56078: getting the remaining hosts for this loop 28285 1727204269.56080: done getting the remaining hosts for this loop 28285 1727204269.56084: getting the next task for host managed-node1 28285 1727204269.56091: done getting next task for host managed-node1 28285 1727204269.56097: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28285 1727204269.56100: ^ state is: HOST STATE: block=3, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204269.56117: getting variables 28285 1727204269.56118: in VariableManager get_vars() 28285 1727204269.56173: Calling all_inventory to load vars for managed-node1 28285 1727204269.56176: Calling groups_inventory to load vars for managed-node1 28285 1727204269.56178: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204269.56188: Calling all_plugins_play to load vars for managed-node1 28285 1727204269.56190: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204269.56193: Calling groups_plugins_play to load vars for managed-node1 28285 1727204269.56432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204269.56976: done with get_vars() 28285 1727204269.56989: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:57:49 -0400 (0:00:00.086) 0:00:10.380 ***** 28285 1727204269.57092: entering _queue_task() for managed-node1/include_tasks 28285 1727204269.57388: worker is 1 (out of 1 available) 28285 1727204269.57401: exiting _queue_task() for managed-node1/include_tasks 28285 1727204269.57412: done queuing things up, now waiting for results queue to drain 28285 1727204269.57414: waiting for pending results... 
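Every skip result so far reports only that the conditional is False, not the fact values behind it. A quick way to see why on managed-node1 is a one-off play like the one below; this is a standalone diagnostic sketch, not part of the test suite or the role.

# Diagnostic sketch (assumption: run separately against the same inventory).
- hosts: managed-node1
  gather_facts: true
  tasks:
    - name: Show the facts behind the skipped conditional
      ansible.builtin.debug:
        msg: >-
          distribution={{ ansible_distribution }}
          major_version={{ ansible_distribution_major_version }}
          would_run={{ ansible_distribution in ['CentOS','RedHat'] and
                       ansible_distribution_major_version | int < 9 }}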
28285 1727204269.57913: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28285 1727204269.58038: in run() - task 0affcd87-79f5-57a1-d976-000000000111 28285 1727204269.58062: variable 'ansible_search_path' from source: unknown 28285 1727204269.58068: variable 'ansible_search_path' from source: unknown 28285 1727204269.58114: calling self._execute() 28285 1727204269.58200: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204269.58204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204269.58232: variable 'omit' from source: magic vars 28285 1727204269.58799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204269.61524: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204269.61629: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204269.61684: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204269.61731: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204269.61768: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204269.61861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204269.61903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204269.61942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204269.61997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204269.62020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204269.62186: variable 'ansible_distribution' from source: facts 28285 1727204269.62199: variable 'ansible_distribution_major_version' from source: facts 28285 1727204269.62229: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204269.62239: when evaluation is False, skipping this task 28285 1727204269.62253: _execute() done 28285 1727204269.62266: dumping result to json 28285 1727204269.62274: done dumping result, returning 28285 1727204269.62287: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-57a1-d976-000000000111] 28285 1727204269.62298: sending task result for task 0affcd87-79f5-57a1-d976-000000000111 skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int 
< 9)", "skip_reason": "Conditional result was False" } 28285 1727204269.62468: no more pending results, returning what we have 28285 1727204269.62472: results queue empty 28285 1727204269.62473: checking for any_errors_fatal 28285 1727204269.62481: done checking for any_errors_fatal 28285 1727204269.62481: checking for max_fail_percentage 28285 1727204269.62483: done checking for max_fail_percentage 28285 1727204269.62484: checking to see if all hosts have failed and the running result is not ok 28285 1727204269.62485: done checking to see if all hosts have failed 28285 1727204269.62486: getting the remaining hosts for this loop 28285 1727204269.62488: done getting the remaining hosts for this loop 28285 1727204269.62493: getting the next task for host managed-node1 28285 1727204269.62499: done getting next task for host managed-node1 28285 1727204269.62503: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 28285 1727204269.62507: ^ state is: HOST STATE: block=3, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204269.62525: getting variables 28285 1727204269.62528: in VariableManager get_vars() 28285 1727204269.62590: Calling all_inventory to load vars for managed-node1 28285 1727204269.62593: Calling groups_inventory to load vars for managed-node1 28285 1727204269.62596: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204269.62607: Calling all_plugins_play to load vars for managed-node1 28285 1727204269.62609: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204269.62612: Calling groups_plugins_play to load vars for managed-node1 28285 1727204269.62808: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204269.63043: done with get_vars() 28285 1727204269.63058: done getting variables 28285 1727204269.63218: done sending task result for task 0affcd87-79f5-57a1-d976-000000000111 28285 1727204269.63221: WORKER PROCESS EXITING 28285 1727204269.63250: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:57:49 -0400 (0:00:00.061) 0:00:10.442 ***** 28285 1727204269.63289: entering _queue_task() for managed-node1/debug 28285 1727204269.63743: worker is 1 (out of 1 available) 28285 1727204269.63766: exiting _queue_task() for managed-node1/debug 28285 1727204269.63779: done queuing things up, now waiting for results queue to drain 28285 1727204269.63781: waiting for pending results... 
28285 1727204269.64060: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider 28285 1727204269.64193: in run() - task 0affcd87-79f5-57a1-d976-000000000112 28285 1727204269.64211: variable 'ansible_search_path' from source: unknown 28285 1727204269.64221: variable 'ansible_search_path' from source: unknown 28285 1727204269.64259: calling self._execute() 28285 1727204269.64352: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204269.64367: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204269.64381: variable 'omit' from source: magic vars 28285 1727204269.64960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204269.68953: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204269.69027: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204269.69075: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204269.69202: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204269.69205: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204269.69223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204269.69261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204269.69287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204269.69336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204269.69351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204269.69548: variable 'ansible_distribution' from source: facts 28285 1727204269.69551: variable 'ansible_distribution_major_version' from source: facts 28285 1727204269.69716: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204269.69720: when evaluation is False, skipping this task 28285 1727204269.69722: _execute() done 28285 1727204269.69724: dumping result to json 28285 1727204269.69727: done dumping result, returning 28285 1727204269.69736: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-57a1-d976-000000000112] 28285 1727204269.69742: sending task result for task 0affcd87-79f5-57a1-d976-000000000112 28285 1727204269.69838: done sending task result for task 0affcd87-79f5-57a1-d976-000000000112 28285 1727204269.69841: WORKER PROCESS EXITING skipping: [managed-node1] => { 
"false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 28285 1727204269.69884: no more pending results, returning what we have 28285 1727204269.69887: results queue empty 28285 1727204269.69888: checking for any_errors_fatal 28285 1727204269.69893: done checking for any_errors_fatal 28285 1727204269.69893: checking for max_fail_percentage 28285 1727204269.69895: done checking for max_fail_percentage 28285 1727204269.69896: checking to see if all hosts have failed and the running result is not ok 28285 1727204269.69897: done checking to see if all hosts have failed 28285 1727204269.69897: getting the remaining hosts for this loop 28285 1727204269.69899: done getting the remaining hosts for this loop 28285 1727204269.69903: getting the next task for host managed-node1 28285 1727204269.69908: done getting next task for host managed-node1 28285 1727204269.69913: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28285 1727204269.69916: ^ state is: HOST STATE: block=3, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204269.69932: getting variables 28285 1727204269.69934: in VariableManager get_vars() 28285 1727204269.69989: Calling all_inventory to load vars for managed-node1 28285 1727204269.69992: Calling groups_inventory to load vars for managed-node1 28285 1727204269.69994: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204269.70004: Calling all_plugins_play to load vars for managed-node1 28285 1727204269.70006: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204269.70009: Calling groups_plugins_play to load vars for managed-node1 28285 1727204269.70238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204269.70443: done with get_vars() 28285 1727204269.70454: done getting variables 28285 1727204269.71152: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:57:49 -0400 (0:00:00.078) 0:00:10.521 ***** 28285 1727204269.71187: entering _queue_task() for managed-node1/fail 28285 1727204269.71477: worker is 1 (out of 1 available) 28285 1727204269.71488: exiting _queue_task() for managed-node1/fail 28285 1727204269.71500: done queuing things up, now waiting for results queue to drain 28285 1727204269.71502: waiting for pending results... 
28285 1727204269.71779: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28285 1727204269.71898: in run() - task 0affcd87-79f5-57a1-d976-000000000113 28285 1727204269.71910: variable 'ansible_search_path' from source: unknown 28285 1727204269.71914: variable 'ansible_search_path' from source: unknown 28285 1727204269.71951: calling self._execute() 28285 1727204269.72039: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204269.72045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204269.72062: variable 'omit' from source: magic vars 28285 1727204269.72522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204269.75424: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204269.75509: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204269.75555: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204269.75592: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204269.75618: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204269.75703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204269.75730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204269.75769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204269.75816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204269.75830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204269.76000: variable 'ansible_distribution' from source: facts 28285 1727204269.76007: variable 'ansible_distribution_major_version' from source: facts 28285 1727204269.76025: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204269.76029: when evaluation is False, skipping this task 28285 1727204269.76032: _execute() done 28285 1727204269.76035: dumping result to json 28285 1727204269.76037: done dumping result, returning 28285 1727204269.76046: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-57a1-d976-000000000113] 28285 1727204269.76054: sending task result for task 
0affcd87-79f5-57a1-d976-000000000113 28285 1727204269.76167: done sending task result for task 0affcd87-79f5-57a1-d976-000000000113 28285 1727204269.76171: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204269.76218: no more pending results, returning what we have 28285 1727204269.76222: results queue empty 28285 1727204269.76223: checking for any_errors_fatal 28285 1727204269.76230: done checking for any_errors_fatal 28285 1727204269.76231: checking for max_fail_percentage 28285 1727204269.76233: done checking for max_fail_percentage 28285 1727204269.76235: checking to see if all hosts have failed and the running result is not ok 28285 1727204269.76235: done checking to see if all hosts have failed 28285 1727204269.76236: getting the remaining hosts for this loop 28285 1727204269.76238: done getting the remaining hosts for this loop 28285 1727204269.76242: getting the next task for host managed-node1 28285 1727204269.76251: done getting next task for host managed-node1 28285 1727204269.76256: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28285 1727204269.76259: ^ state is: HOST STATE: block=3, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28285 1727204269.76283: getting variables 28285 1727204269.76285: in VariableManager get_vars() 28285 1727204269.76341: Calling all_inventory to load vars for managed-node1 28285 1727204269.76345: Calling groups_inventory to load vars for managed-node1 28285 1727204269.76350: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204269.76362: Calling all_plugins_play to load vars for managed-node1 28285 1727204269.76366: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204269.76370: Calling groups_plugins_play to load vars for managed-node1 28285 1727204269.76558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204269.76832: done with get_vars() 28285 1727204269.76843: done getting variables 28285 1727204269.77024: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:57:49 -0400 (0:00:00.058) 0:00:10.580 ***** 28285 1727204269.77068: entering _queue_task() for managed-node1/fail 28285 1727204269.77466: worker is 1 (out of 1 available) 28285 1727204269.77478: exiting _queue_task() for managed-node1/fail 28285 1727204269.77491: done queuing things up, now waiting for results queue to drain 28285 1727204269.77493: waiting for pending results... 
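The "Abort applying ..." guards seen so far in this role (main.yml:11 and :18) each load the fail action module before being skipped. Their general shape is a fail task behind its own when:; the message and the inner condition in the sketch below are placeholders, since the role's source is not part of this log.

# Placeholder sketch of a guard task; wording and condition are assumptions.
- name: Abort applying the network state configuration if the system version of the managed host is below 8
  ansible.builtin.fail:
    msg: Applying the network state configuration is not supported on this system version.   # placeholder wording
  when: ansible_distribution_major_version | int < 8   # placeholder; the role's real condition is not shown here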
28285 1727204269.77775: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28285 1727204269.77910: in run() - task 0affcd87-79f5-57a1-d976-000000000114 28285 1727204269.77928: variable 'ansible_search_path' from source: unknown 28285 1727204269.77937: variable 'ansible_search_path' from source: unknown 28285 1727204269.77976: calling self._execute() 28285 1727204269.78067: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204269.78078: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204269.78089: variable 'omit' from source: magic vars 28285 1727204269.78529: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204269.81294: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204269.81368: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204269.81417: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204269.81458: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204269.81497: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204269.81577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204269.81617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204269.81647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204269.81697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204269.81720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204269.81870: variable 'ansible_distribution' from source: facts 28285 1727204269.81881: variable 'ansible_distribution_major_version' from source: facts 28285 1727204269.81902: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204269.81909: when evaluation is False, skipping this task 28285 1727204269.81917: _execute() done 28285 1727204269.81928: dumping result to json 28285 1727204269.81935: done dumping result, returning 28285 1727204269.81945: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-57a1-d976-000000000114] 28285 1727204269.81959: sending task result for task 0affcd87-79f5-57a1-d976-000000000114 skipping: [managed-node1] 
=> { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204269.82109: no more pending results, returning what we have 28285 1727204269.82113: results queue empty 28285 1727204269.82114: checking for any_errors_fatal 28285 1727204269.82121: done checking for any_errors_fatal 28285 1727204269.82121: checking for max_fail_percentage 28285 1727204269.82123: done checking for max_fail_percentage 28285 1727204269.82124: checking to see if all hosts have failed and the running result is not ok 28285 1727204269.82125: done checking to see if all hosts have failed 28285 1727204269.82126: getting the remaining hosts for this loop 28285 1727204269.82128: done getting the remaining hosts for this loop 28285 1727204269.82132: getting the next task for host managed-node1 28285 1727204269.82138: done getting next task for host managed-node1 28285 1727204269.82143: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28285 1727204269.82146: ^ state is: HOST STATE: block=3, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204269.82168: getting variables 28285 1727204269.82171: in VariableManager get_vars() 28285 1727204269.82226: Calling all_inventory to load vars for managed-node1 28285 1727204269.82229: Calling groups_inventory to load vars for managed-node1 28285 1727204269.82231: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204269.82242: Calling all_plugins_play to load vars for managed-node1 28285 1727204269.82245: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204269.82250: Calling groups_plugins_play to load vars for managed-node1 28285 1727204269.82495: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204269.82796: done with get_vars() 28285 1727204269.82811: done getting variables 28285 1727204269.82915: done sending task result for task 0affcd87-79f5-57a1-d976-000000000114 28285 1727204269.82919: WORKER PROCESS EXITING 28285 1727204269.82956: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:57:49 -0400 (0:00:00.059) 0:00:10.639 ***** 28285 1727204269.82996: entering _queue_task() for managed-node1/fail 28285 1727204269.83428: worker is 1 (out of 1 available) 28285 1727204269.83442: exiting _queue_task() for managed-node1/fail 
28285 1727204269.83462: done queuing things up, now waiting for results queue to drain 28285 1727204269.83466: waiting for pending results... 28285 1727204269.83892: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28285 1727204269.84093: in run() - task 0affcd87-79f5-57a1-d976-000000000115 28285 1727204269.84119: variable 'ansible_search_path' from source: unknown 28285 1727204269.84131: variable 'ansible_search_path' from source: unknown 28285 1727204269.84178: calling self._execute() 28285 1727204269.84314: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204269.84345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204269.84456: variable 'omit' from source: magic vars 28285 1727204269.85435: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204269.90297: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204269.90431: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204269.90631: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204269.90689: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204269.90746: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204269.90917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204269.90961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204269.90996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204269.91041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204269.91071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204269.91215: variable 'ansible_distribution' from source: facts 28285 1727204269.91228: variable 'ansible_distribution_major_version' from source: facts 28285 1727204269.91254: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204269.91262: when evaluation is False, skipping this task 28285 1727204269.91270: _execute() done 28285 1727204269.91283: dumping result to json 28285 1727204269.91291: done dumping result, returning 28285 1727204269.91302: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 
[0affcd87-79f5-57a1-d976-000000000115] 28285 1727204269.91311: sending task result for task 0affcd87-79f5-57a1-d976-000000000115 skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204269.91468: no more pending results, returning what we have 28285 1727204269.91473: results queue empty 28285 1727204269.91473: checking for any_errors_fatal 28285 1727204269.91480: done checking for any_errors_fatal 28285 1727204269.91481: checking for max_fail_percentage 28285 1727204269.91482: done checking for max_fail_percentage 28285 1727204269.91483: checking to see if all hosts have failed and the running result is not ok 28285 1727204269.91484: done checking to see if all hosts have failed 28285 1727204269.91485: getting the remaining hosts for this loop 28285 1727204269.91486: done getting the remaining hosts for this loop 28285 1727204269.91490: getting the next task for host managed-node1 28285 1727204269.91496: done getting next task for host managed-node1 28285 1727204269.91500: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28285 1727204269.91503: ^ state is: HOST STATE: block=3, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28285 1727204269.91523: getting variables 28285 1727204269.91526: in VariableManager get_vars() 28285 1727204269.91593: Calling all_inventory to load vars for managed-node1 28285 1727204269.91596: Calling groups_inventory to load vars for managed-node1 28285 1727204269.91598: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204269.91608: Calling all_plugins_play to load vars for managed-node1 28285 1727204269.91610: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204269.91612: Calling groups_plugins_play to load vars for managed-node1 28285 1727204269.91787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204269.92013: done with get_vars() 28285 1727204269.92025: done getting variables 28285 1727204269.92089: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 28285 1727204269.92227: done sending task result for task 0affcd87-79f5-57a1-d976-000000000115 28285 1727204269.92231: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:57:49 -0400 (0:00:00.092) 0:00:10.732 ***** 28285 1727204269.92247: entering _queue_task() for managed-node1/dnf 28285 1727204269.92702: worker is 1 (out of 1 available) 28285 1727204269.92712: exiting _queue_task() for managed-node1/dnf 28285 1727204269.92723: done queuing things up, now waiting for results queue to drain 28285 1727204269.92725: waiting for pending results... 
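The task queued above (main.yml:36) loads the dnf action plugin and, going by its name, only probes whether newer network packages are available because of wireless or team interfaces, rather than installing anything. One way to express that kind of probe is a check_mode dnf task; the package list and register name below are assumptions, not the role's actual implementation.

# Illustrative probe only -- package names and register are placeholders.
- name: Check if updates for network packages are available (illustrative)
  ansible.builtin.dnf:
    name:
      - NetworkManager
      - wpa_supplicant
    state: latest
  check_mode: true
  register: __network_updates   # hypothetical variable name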
28285 1727204269.92997: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28285 1727204269.93132: in run() - task 0affcd87-79f5-57a1-d976-000000000116 28285 1727204269.93152: variable 'ansible_search_path' from source: unknown 28285 1727204269.93161: variable 'ansible_search_path' from source: unknown 28285 1727204269.93201: calling self._execute() 28285 1727204269.93294: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204269.93304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204269.93316: variable 'omit' from source: magic vars 28285 1727204269.93766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204269.96389: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204269.96489: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204269.96544: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204269.96606: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204269.96642: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204269.96742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204269.96782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204269.96819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204269.96889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204269.96908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204269.97084: variable 'ansible_distribution' from source: facts 28285 1727204269.97109: variable 'ansible_distribution_major_version' from source: facts 28285 1727204269.97138: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204269.97147: when evaluation is False, skipping this task 28285 1727204269.97157: _execute() done 28285 1727204269.97166: dumping result to json 28285 1727204269.97175: done dumping result, returning 28285 1727204269.97192: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-57a1-d976-000000000116] 28285 1727204269.97203: sending task result for task 
0affcd87-79f5-57a1-d976-000000000116 skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204269.97373: no more pending results, returning what we have 28285 1727204269.97377: results queue empty 28285 1727204269.97378: checking for any_errors_fatal 28285 1727204269.97385: done checking for any_errors_fatal 28285 1727204269.97386: checking for max_fail_percentage 28285 1727204269.97388: done checking for max_fail_percentage 28285 1727204269.97389: checking to see if all hosts have failed and the running result is not ok 28285 1727204269.97390: done checking to see if all hosts have failed 28285 1727204269.97391: getting the remaining hosts for this loop 28285 1727204269.97392: done getting the remaining hosts for this loop 28285 1727204269.97396: getting the next task for host managed-node1 28285 1727204269.97403: done getting next task for host managed-node1 28285 1727204269.97408: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28285 1727204269.97411: ^ state is: HOST STATE: block=3, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28285 1727204269.97432: getting variables 28285 1727204269.97434: in VariableManager get_vars() 28285 1727204269.97494: Calling all_inventory to load vars for managed-node1 28285 1727204269.97497: Calling groups_inventory to load vars for managed-node1 28285 1727204269.97500: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204269.97511: Calling all_plugins_play to load vars for managed-node1 28285 1727204269.97514: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204269.97518: Calling groups_plugins_play to load vars for managed-node1 28285 1727204269.97788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204269.98032: done with get_vars() 28285 1727204269.98042: done getting variables 28285 1727204269.98178: done sending task result for task 0affcd87-79f5-57a1-d976-000000000116 28285 1727204269.98182: WORKER PROCESS EXITING redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28285 1727204269.98243: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:57:49 -0400 (0:00:00.060) 0:00:10.792 ***** 28285 1727204269.98286: entering _queue_task() for managed-node1/yum 28285 1727204269.98717: worker is 1 (out of 1 available) 28285 1727204269.98729: exiting _queue_task() for managed-node1/yum 28285 1727204269.98741: done queuing things up, now waiting for results queue to drain 28285 1727204269.98742: waiting for pending results... 
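The skip recorded above is produced by a when: guard evaluated against the gathered facts ansible_distribution and ansible_distribution_major_version; on this node the condition is False, so the underlying module never runs. A minimal sketch of a task gated the same way is given below. Only the task name and the conditional are taken from the log (the conditional is copied from the false_condition field); the module arguments are illustrative assumptions, not the role's actual source.

- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    list: NetworkManager        # assumed argument, for illustration only
  when:
    - ansible_distribution in ['CentOS', 'RedHat']
    - ansible_distribution_major_version | int < 9

The YUM variant queued next is gated by the same condition; the "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" line above shows ansible-core's plugin routing resolving the legacy yum action to dnf before the task is dispatched.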
28285 1727204269.99006: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28285 1727204269.99137: in run() - task 0affcd87-79f5-57a1-d976-000000000117 28285 1727204269.99157: variable 'ansible_search_path' from source: unknown 28285 1727204269.99167: variable 'ansible_search_path' from source: unknown 28285 1727204269.99209: calling self._execute() 28285 1727204269.99296: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204269.99309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204269.99320: variable 'omit' from source: magic vars 28285 1727204269.99786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204270.02686: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204270.02793: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204270.02841: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204270.02899: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204270.02932: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204270.03021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204270.03059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204270.03099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204270.03152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204270.03176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204270.03359: variable 'ansible_distribution' from source: facts 28285 1727204270.03374: variable 'ansible_distribution_major_version' from source: facts 28285 1727204270.03410: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204270.03420: when evaluation is False, skipping this task 28285 1727204270.03427: _execute() done 28285 1727204270.03437: dumping result to json 28285 1727204270.03450: done dumping result, returning 28285 1727204270.03472: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-57a1-d976-000000000117] 28285 1727204270.03484: sending task result for task 
0affcd87-79f5-57a1-d976-000000000117 skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204270.03638: no more pending results, returning what we have 28285 1727204270.03642: results queue empty 28285 1727204270.03643: checking for any_errors_fatal 28285 1727204270.03652: done checking for any_errors_fatal 28285 1727204270.03653: checking for max_fail_percentage 28285 1727204270.03655: done checking for max_fail_percentage 28285 1727204270.03656: checking to see if all hosts have failed and the running result is not ok 28285 1727204270.03657: done checking to see if all hosts have failed 28285 1727204270.03657: getting the remaining hosts for this loop 28285 1727204270.03659: done getting the remaining hosts for this loop 28285 1727204270.03663: getting the next task for host managed-node1 28285 1727204270.03671: done getting next task for host managed-node1 28285 1727204270.03675: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28285 1727204270.03679: ^ state is: HOST STATE: block=3, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28285 1727204270.03697: getting variables 28285 1727204270.03699: in VariableManager get_vars() 28285 1727204270.03757: Calling all_inventory to load vars for managed-node1 28285 1727204270.03761: Calling groups_inventory to load vars for managed-node1 28285 1727204270.03765: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204270.03776: Calling all_plugins_play to load vars for managed-node1 28285 1727204270.03779: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204270.03782: Calling groups_plugins_play to load vars for managed-node1 28285 1727204270.03979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204270.04214: done with get_vars() 28285 1727204270.04226: done getting variables 28285 1727204270.04380: done sending task result for task 0affcd87-79f5-57a1-d976-000000000117 28285 1727204270.04383: WORKER PROCESS EXITING 28285 1727204270.04424: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:57:50 -0400 (0:00:00.062) 0:00:10.855 ***** 28285 1727204270.04544: entering _queue_task() for managed-node1/fail 28285 1727204270.04993: worker is 1 (out of 1 available) 28285 1727204270.05005: exiting _queue_task() for managed-node1/fail 28285 1727204270.05017: done queuing things up, now waiting for results queue to drain 28285 1727204270.05018: waiting for pending results... 
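The redirect message logged before the YUM task is produced by ansible-core's plugin routing table rather than by the role itself. A routing entry of roughly the following shape makes a call to the yum action be served by the dnf plugin; the exact builtin routing file and layout in ansible-core 2.17 may differ, so treat this as a sketch of the mechanism, not a quote of the shipped file:

plugin_routing:
  action:
    yum:
      redirect: ansible.builtin.dnf
  modules:
    yum:
      redirect: ansible.builtin.dnf

Collections use the same plugin_routing structure in their meta/runtime.yml to rename or deprecate plugins without breaking existing playbooks.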
28285 1727204270.05300: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28285 1727204270.05472: in run() - task 0affcd87-79f5-57a1-d976-000000000118 28285 1727204270.05572: variable 'ansible_search_path' from source: unknown 28285 1727204270.05581: variable 'ansible_search_path' from source: unknown 28285 1727204270.05623: calling self._execute() 28285 1727204270.05726: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204270.05738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204270.05755: variable 'omit' from source: magic vars 28285 1727204270.06225: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204270.08843: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204270.08925: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204270.08970: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204270.09010: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204270.09051: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204270.09140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204270.09180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204270.09210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204270.09270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204270.09291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204270.09426: variable 'ansible_distribution' from source: facts 28285 1727204270.09436: variable 'ansible_distribution_major_version' from source: facts 28285 1727204270.09469: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204270.09476: when evaluation is False, skipping this task 28285 1727204270.09482: _execute() done 28285 1727204270.09488: dumping result to json 28285 1727204270.09494: done dumping result, returning 28285 1727204270.09506: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-57a1-d976-000000000118] 28285 1727204270.09517: sending task result for task 0affcd87-79f5-57a1-d976-000000000118 skipping: [managed-node1] => { "changed": false, 
"false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204270.09695: no more pending results, returning what we have 28285 1727204270.09700: results queue empty 28285 1727204270.09701: checking for any_errors_fatal 28285 1727204270.09710: done checking for any_errors_fatal 28285 1727204270.09711: checking for max_fail_percentage 28285 1727204270.09714: done checking for max_fail_percentage 28285 1727204270.09715: checking to see if all hosts have failed and the running result is not ok 28285 1727204270.09716: done checking to see if all hosts have failed 28285 1727204270.09717: getting the remaining hosts for this loop 28285 1727204270.09719: done getting the remaining hosts for this loop 28285 1727204270.09723: getting the next task for host managed-node1 28285 1727204270.09730: done getting next task for host managed-node1 28285 1727204270.09735: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 28285 1727204270.09739: ^ state is: HOST STATE: block=3, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204270.09762: getting variables 28285 1727204270.09774: in VariableManager get_vars() 28285 1727204270.09836: Calling all_inventory to load vars for managed-node1 28285 1727204270.09840: Calling groups_inventory to load vars for managed-node1 28285 1727204270.09842: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204270.09857: Calling all_plugins_play to load vars for managed-node1 28285 1727204270.09860: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204270.09866: Calling groups_plugins_play to load vars for managed-node1 28285 1727204270.10129: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204270.10462: done with get_vars() 28285 1727204270.10476: done getting variables 28285 1727204270.10540: done sending task result for task 0affcd87-79f5-57a1-d976-000000000118 28285 1727204270.10543: WORKER PROCESS EXITING 28285 1727204270.10589: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:57:50 -0400 (0:00:00.060) 0:00:10.916 ***** 28285 1727204270.10629: entering _queue_task() for managed-node1/package 28285 1727204270.11098: worker is 1 (out of 1 available) 28285 1727204270.11110: exiting _queue_task() for managed-node1/package 28285 1727204270.11123: done queuing things up, now waiting for results queue to drain 28285 1727204270.11125: waiting for pending results... 
28285 1727204270.11413: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages 28285 1727204270.11559: in run() - task 0affcd87-79f5-57a1-d976-000000000119 28285 1727204270.11586: variable 'ansible_search_path' from source: unknown 28285 1727204270.11595: variable 'ansible_search_path' from source: unknown 28285 1727204270.11640: calling self._execute() 28285 1727204270.11730: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204270.11744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204270.11766: variable 'omit' from source: magic vars 28285 1727204270.12250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204270.14841: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204270.15195: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204270.15242: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204270.15288: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204270.15318: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204270.15407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204270.15439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204270.15482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204270.15528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204270.15550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204270.15710: variable 'ansible_distribution' from source: facts 28285 1727204270.15721: variable 'ansible_distribution_major_version' from source: facts 28285 1727204270.15741: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204270.15750: when evaluation is False, skipping this task 28285 1727204270.15757: _execute() done 28285 1727204270.15763: dumping result to json 28285 1727204270.15775: done dumping result, returning 28285 1727204270.15786: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-57a1-d976-000000000119] 28285 1727204270.15795: sending task result for task 0affcd87-79f5-57a1-d976-000000000119 skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": 
"Conditional result was False" } 28285 1727204270.15962: no more pending results, returning what we have 28285 1727204270.15968: results queue empty 28285 1727204270.15969: checking for any_errors_fatal 28285 1727204270.15975: done checking for any_errors_fatal 28285 1727204270.15976: checking for max_fail_percentage 28285 1727204270.15978: done checking for max_fail_percentage 28285 1727204270.15979: checking to see if all hosts have failed and the running result is not ok 28285 1727204270.15980: done checking to see if all hosts have failed 28285 1727204270.15981: getting the remaining hosts for this loop 28285 1727204270.15983: done getting the remaining hosts for this loop 28285 1727204270.15987: getting the next task for host managed-node1 28285 1727204270.15994: done getting next task for host managed-node1 28285 1727204270.15998: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28285 1727204270.16001: ^ state is: HOST STATE: block=3, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204270.16022: getting variables 28285 1727204270.16024: in VariableManager get_vars() 28285 1727204270.16085: Calling all_inventory to load vars for managed-node1 28285 1727204270.16088: Calling groups_inventory to load vars for managed-node1 28285 1727204270.16090: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204270.16100: Calling all_plugins_play to load vars for managed-node1 28285 1727204270.16102: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204270.16104: Calling groups_plugins_play to load vars for managed-node1 28285 1727204270.16255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204270.16485: done with get_vars() 28285 1727204270.16497: done getting variables 28285 1727204270.16670: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 28285 1727204270.16702: done sending task result for task 0affcd87-79f5-57a1-d976-000000000119 28285 1727204270.16706: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:57:50 -0400 (0:00:00.060) 0:00:10.977 ***** 28285 1727204270.16722: entering _queue_task() for managed-node1/package 28285 1727204270.17232: worker is 1 (out of 1 available) 28285 1727204270.17251: exiting _queue_task() for managed-node1/package 28285 1727204270.17266: done queuing things up, now waiting for results queue to drain 28285 1727204270.17268: waiting for pending results... 
28285 1727204270.17551: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28285 1727204270.17699: in run() - task 0affcd87-79f5-57a1-d976-00000000011a 28285 1727204270.17723: variable 'ansible_search_path' from source: unknown 28285 1727204270.17731: variable 'ansible_search_path' from source: unknown 28285 1727204270.17778: calling self._execute() 28285 1727204270.17877: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204270.17894: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204270.17909: variable 'omit' from source: magic vars 28285 1727204270.18384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204270.21207: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204270.21284: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204270.21328: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204270.21375: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204270.21409: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204270.21494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204270.21531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204270.21570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204270.21618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204270.21639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204270.21794: variable 'ansible_distribution' from source: facts 28285 1727204270.21806: variable 'ansible_distribution_major_version' from source: facts 28285 1727204270.21826: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204270.21833: when evaluation is False, skipping this task 28285 1727204270.21843: _execute() done 28285 1727204270.21851: dumping result to json 28285 1727204270.21858: done dumping result, returning 28285 1727204270.21871: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-57a1-d976-00000000011a] 28285 1727204270.21881: sending task result for task 0affcd87-79f5-57a1-d976-00000000011a skipping: [managed-node1] => { "changed": false, "false_condition": 
"(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204270.22035: no more pending results, returning what we have 28285 1727204270.22039: results queue empty 28285 1727204270.22039: checking for any_errors_fatal 28285 1727204270.22046: done checking for any_errors_fatal 28285 1727204270.22047: checking for max_fail_percentage 28285 1727204270.22051: done checking for max_fail_percentage 28285 1727204270.22052: checking to see if all hosts have failed and the running result is not ok 28285 1727204270.22053: done checking to see if all hosts have failed 28285 1727204270.22054: getting the remaining hosts for this loop 28285 1727204270.22056: done getting the remaining hosts for this loop 28285 1727204270.22059: getting the next task for host managed-node1 28285 1727204270.22068: done getting next task for host managed-node1 28285 1727204270.22072: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28285 1727204270.22075: ^ state is: HOST STATE: block=3, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204270.22093: getting variables 28285 1727204270.22095: in VariableManager get_vars() 28285 1727204270.22147: Calling all_inventory to load vars for managed-node1 28285 1727204270.22153: Calling groups_inventory to load vars for managed-node1 28285 1727204270.22155: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204270.22167: Calling all_plugins_play to load vars for managed-node1 28285 1727204270.22170: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204270.22173: Calling groups_plugins_play to load vars for managed-node1 28285 1727204270.22421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204270.22651: done with get_vars() 28285 1727204270.22663: done getting variables 28285 1727204270.22812: done sending task result for task 0affcd87-79f5-57a1-d976-00000000011a 28285 1727204270.22815: WORKER PROCESS EXITING 28285 1727204270.22855: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:57:50 -0400 (0:00:00.061) 0:00:11.038 ***** 28285 1727204270.22895: entering _queue_task() for managed-node1/package 28285 1727204270.23374: worker is 1 (out of 1 available) 28285 1727204270.23388: exiting _queue_task() for managed-node1/package 28285 1727204270.23402: done queuing things up, now waiting for results queue to drain 28285 
1727204270.23403: waiting for pending results... 28285 1727204270.23704: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28285 1727204270.23851: in run() - task 0affcd87-79f5-57a1-d976-00000000011b 28285 1727204270.23874: variable 'ansible_search_path' from source: unknown 28285 1727204270.23882: variable 'ansible_search_path' from source: unknown 28285 1727204270.23928: calling self._execute() 28285 1727204270.24046: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204270.24065: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204270.24105: variable 'omit' from source: magic vars 28285 1727204270.24598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204270.28385: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204270.28462: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204270.28518: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204270.28575: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204270.28611: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204270.28702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204270.28737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204270.28777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204270.28829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204270.28858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204270.29012: variable 'ansible_distribution' from source: facts 28285 1727204270.29029: variable 'ansible_distribution_major_version' from source: facts 28285 1727204270.29055: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204270.29065: when evaluation is False, skipping this task 28285 1727204270.29075: _execute() done 28285 1727204270.29081: dumping result to json 28285 1727204270.29089: done dumping result, returning 28285 1727204270.29101: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-57a1-d976-00000000011b] 28285 1727204270.29112: sending task result for task 0affcd87-79f5-57a1-d976-00000000011b skipping: [managed-node1] => { "changed": false, 
"false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204270.29279: no more pending results, returning what we have 28285 1727204270.29283: results queue empty 28285 1727204270.29284: checking for any_errors_fatal 28285 1727204270.29291: done checking for any_errors_fatal 28285 1727204270.29292: checking for max_fail_percentage 28285 1727204270.29294: done checking for max_fail_percentage 28285 1727204270.29295: checking to see if all hosts have failed and the running result is not ok 28285 1727204270.29296: done checking to see if all hosts have failed 28285 1727204270.29297: getting the remaining hosts for this loop 28285 1727204270.29299: done getting the remaining hosts for this loop 28285 1727204270.29303: getting the next task for host managed-node1 28285 1727204270.29310: done getting next task for host managed-node1 28285 1727204270.29314: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28285 1727204270.29317: ^ state is: HOST STATE: block=3, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204270.29335: getting variables 28285 1727204270.29337: in VariableManager get_vars() 28285 1727204270.29399: Calling all_inventory to load vars for managed-node1 28285 1727204270.29402: Calling groups_inventory to load vars for managed-node1 28285 1727204270.29405: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204270.29415: Calling all_plugins_play to load vars for managed-node1 28285 1727204270.29418: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204270.29422: Calling groups_plugins_play to load vars for managed-node1 28285 1727204270.29614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204270.29842: done with get_vars() 28285 1727204270.29856: done getting variables 28285 1727204270.29937: done sending task result for task 0affcd87-79f5-57a1-d976-00000000011b 28285 1727204270.29940: WORKER PROCESS EXITING 28285 1727204270.29981: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:57:50 -0400 (0:00:00.071) 0:00:11.110 ***** 28285 1727204270.30228: entering _queue_task() for managed-node1/service 28285 1727204270.30673: worker is 1 (out of 1 available) 28285 1727204270.30685: exiting _queue_task() for managed-node1/service 28285 1727204270.30702: done queuing things up, now waiting for results queue to 
drain 28285 1727204270.30703: waiting for pending results... 28285 1727204270.30981: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28285 1727204270.31115: in run() - task 0affcd87-79f5-57a1-d976-00000000011c 28285 1727204270.31140: variable 'ansible_search_path' from source: unknown 28285 1727204270.31153: variable 'ansible_search_path' from source: unknown 28285 1727204270.31193: calling self._execute() 28285 1727204270.31287: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204270.31296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204270.31308: variable 'omit' from source: magic vars 28285 1727204270.31761: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204270.37311: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204270.37431: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204270.37560: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204270.37605: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204270.37655: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204270.37814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204270.37900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204270.37983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204270.38106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204270.38127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204270.38425: variable 'ansible_distribution' from source: facts 28285 1727204270.38478: variable 'ansible_distribution_major_version' from source: facts 28285 1727204270.38506: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204270.38602: when evaluation is False, skipping this task 28285 1727204270.38610: _execute() done 28285 1727204270.38617: dumping result to json 28285 1727204270.38625: done dumping result, returning 28285 1727204270.38637: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-57a1-d976-00000000011c] 28285 1727204270.38651: sending task result for task 0affcd87-79f5-57a1-d976-00000000011c skipping: [managed-node1] => { "changed": false, 
"false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204270.38803: no more pending results, returning what we have 28285 1727204270.38807: results queue empty 28285 1727204270.38808: checking for any_errors_fatal 28285 1727204270.38815: done checking for any_errors_fatal 28285 1727204270.38816: checking for max_fail_percentage 28285 1727204270.38818: done checking for max_fail_percentage 28285 1727204270.38819: checking to see if all hosts have failed and the running result is not ok 28285 1727204270.38819: done checking to see if all hosts have failed 28285 1727204270.38820: getting the remaining hosts for this loop 28285 1727204270.38822: done getting the remaining hosts for this loop 28285 1727204270.38827: getting the next task for host managed-node1 28285 1727204270.38834: done getting next task for host managed-node1 28285 1727204270.38838: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28285 1727204270.38841: ^ state is: HOST STATE: block=3, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204270.38863: getting variables 28285 1727204270.38866: in VariableManager get_vars() 28285 1727204270.38921: Calling all_inventory to load vars for managed-node1 28285 1727204270.38924: Calling groups_inventory to load vars for managed-node1 28285 1727204270.38927: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204270.38938: Calling all_plugins_play to load vars for managed-node1 28285 1727204270.38941: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204270.38945: Calling groups_plugins_play to load vars for managed-node1 28285 1727204270.39190: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204270.39409: done with get_vars() 28285 1727204270.39420: done getting variables 28285 1727204270.39770: done sending task result for task 0affcd87-79f5-57a1-d976-00000000011c 28285 1727204270.39774: WORKER PROCESS EXITING 28285 1727204270.39812: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:57:50 -0400 (0:00:00.098) 0:00:11.208 ***** 28285 1727204270.39851: entering _queue_task() for managed-node1/service 28285 1727204270.40507: worker is 1 (out of 1 available) 28285 1727204270.40518: exiting _queue_task() for managed-node1/service 28285 1727204270.40529: done queuing things up, now waiting for results queue to drain 28285 1727204270.40531: waiting for pending 
results... 28285 1727204270.41404: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28285 1727204270.41630: in run() - task 0affcd87-79f5-57a1-d976-00000000011d 28285 1727204270.41642: variable 'ansible_search_path' from source: unknown 28285 1727204270.41645: variable 'ansible_search_path' from source: unknown 28285 1727204270.41793: calling self._execute() 28285 1727204270.41874: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204270.41880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204270.41889: variable 'omit' from source: magic vars 28285 1727204270.42757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204270.48598: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204270.48676: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204270.48851: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204270.48894: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204270.48930: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204270.49105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204270.49167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204270.49272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204270.49317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204270.49386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204270.49640: variable 'ansible_distribution' from source: facts 28285 1727204270.49790: variable 'ansible_distribution_major_version' from source: facts 28285 1727204270.49811: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204270.49818: when evaluation is False, skipping this task 28285 1727204270.49824: _execute() done 28285 1727204270.49830: dumping result to json 28285 1727204270.49836: done dumping result, returning 28285 1727204270.49851: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-57a1-d976-00000000011d] 28285 1727204270.49865: sending task result for task 0affcd87-79f5-57a1-d976-00000000011d 28285 1727204270.49987: done sending task result for task 0affcd87-79f5-57a1-d976-00000000011d skipping: [managed-node1] => { "censored": "the output 
has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28285 1727204270.50037: no more pending results, returning what we have 28285 1727204270.50041: results queue empty 28285 1727204270.50042: checking for any_errors_fatal 28285 1727204270.50051: done checking for any_errors_fatal 28285 1727204270.50053: checking for max_fail_percentage 28285 1727204270.50055: done checking for max_fail_percentage 28285 1727204270.50056: checking to see if all hosts have failed and the running result is not ok 28285 1727204270.50057: done checking to see if all hosts have failed 28285 1727204270.50058: getting the remaining hosts for this loop 28285 1727204270.50060: done getting the remaining hosts for this loop 28285 1727204270.50065: getting the next task for host managed-node1 28285 1727204270.50072: done getting next task for host managed-node1 28285 1727204270.50076: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28285 1727204270.50079: ^ state is: HOST STATE: block=3, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204270.50097: getting variables 28285 1727204270.50099: in VariableManager get_vars() 28285 1727204270.50158: Calling all_inventory to load vars for managed-node1 28285 1727204270.50162: Calling groups_inventory to load vars for managed-node1 28285 1727204270.50167: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204270.50178: Calling all_plugins_play to load vars for managed-node1 28285 1727204270.50181: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204270.50184: Calling groups_plugins_play to load vars for managed-node1 28285 1727204270.50368: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204270.50581: done with get_vars() 28285 1727204270.50593: done getting variables 28285 1727204270.50656: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:57:50 -0400 (0:00:00.108) 0:00:11.316 ***** 28285 1727204270.50698: entering _queue_task() for managed-node1/service 28285 1727204270.50716: WORKER PROCESS EXITING 28285 1727204270.51347: worker is 1 (out of 1 available) 28285 1727204270.51359: exiting _queue_task() for managed-node1/service 28285 1727204270.51374: done queuing things up, now waiting for results queue to drain 28285 1727204270.51376: waiting for pending results... 
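The censored result above is what Ansible prints when a task sets no_log: true; even a skipped result is masked so that service arguments or secrets cannot leak into the log. A minimal sketch of a service task with that behavior follows; the task name, the no_log setting, and the conditional mirror what the log shows, while the service arguments are assumptions rather than the role's actual task:

- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: NetworkManager       # assumed arguments, for illustration only
    state: started
    enabled: true
  no_log: true
  when:
    - ansible_distribution in ['CentOS', 'RedHat']
    - ansible_distribution_major_version | int < 9

With no_log set, the callback replaces the task result with the "censored" placeholder seen above for both executed and skipped outcomes.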
28285 1727204270.51654: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28285 1727204270.51787: in run() - task 0affcd87-79f5-57a1-d976-00000000011e 28285 1727204270.51808: variable 'ansible_search_path' from source: unknown 28285 1727204270.51822: variable 'ansible_search_path' from source: unknown 28285 1727204270.51865: calling self._execute() 28285 1727204270.51959: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204270.51973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204270.51987: variable 'omit' from source: magic vars 28285 1727204270.52425: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204270.55526: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204270.55602: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204270.55644: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204270.55693: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204270.55723: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204270.55811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204270.55921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204270.55951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204270.56052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204270.56085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204270.56251: variable 'ansible_distribution' from source: facts 28285 1727204270.56263: variable 'ansible_distribution_major_version' from source: facts 28285 1727204270.56288: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204270.56296: when evaluation is False, skipping this task 28285 1727204270.56303: _execute() done 28285 1727204270.56308: dumping result to json 28285 1727204270.56315: done dumping result, returning 28285 1727204270.56325: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-57a1-d976-00000000011e] 28285 1727204270.56342: sending task result for task 0affcd87-79f5-57a1-d976-00000000011e skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 
9)", "skip_reason": "Conditional result was False" } 28285 1727204270.56500: no more pending results, returning what we have 28285 1727204270.56503: results queue empty 28285 1727204270.56504: checking for any_errors_fatal 28285 1727204270.56511: done checking for any_errors_fatal 28285 1727204270.56511: checking for max_fail_percentage 28285 1727204270.56513: done checking for max_fail_percentage 28285 1727204270.56515: checking to see if all hosts have failed and the running result is not ok 28285 1727204270.56515: done checking to see if all hosts have failed 28285 1727204270.56516: getting the remaining hosts for this loop 28285 1727204270.56518: done getting the remaining hosts for this loop 28285 1727204270.56522: getting the next task for host managed-node1 28285 1727204270.56528: done getting next task for host managed-node1 28285 1727204270.56532: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 28285 1727204270.56536: ^ state is: HOST STATE: block=3, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204270.56555: getting variables 28285 1727204270.56558: in VariableManager get_vars() 28285 1727204270.56613: Calling all_inventory to load vars for managed-node1 28285 1727204270.56617: Calling groups_inventory to load vars for managed-node1 28285 1727204270.56619: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204270.56630: Calling all_plugins_play to load vars for managed-node1 28285 1727204270.56632: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204270.56635: Calling groups_plugins_play to load vars for managed-node1 28285 1727204270.57074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204270.57482: done with get_vars() 28285 1727204270.57492: done getting variables 28285 1727204270.57625: done sending task result for task 0affcd87-79f5-57a1-d976-00000000011e 28285 1727204270.57629: WORKER PROCESS EXITING 28285 1727204270.57671: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:57:50 -0400 (0:00:00.070) 0:00:11.387 ***** 28285 1727204270.57701: entering _queue_task() for managed-node1/service 28285 1727204270.57977: worker is 1 (out of 1 available) 28285 1727204270.57989: exiting _queue_task() for managed-node1/service 28285 1727204270.58000: done queuing things up, now waiting for results queue to drain 28285 1727204270.58002: waiting for pending results... 
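The task queued above targets the legacy initscripts-based "network" service, the provider the role can use on older CentOS/RHEL releases (hence the repeated "major version < 9" guard). A sketch of what such a task can look like follows; the service name and arguments are assumptions inferred from the task name, and the same distribution guard, evaluated just below, skips it on this node:

- name: Enable network service
  ansible.builtin.service:
    name: network              # assumed service name (initscripts provider)
    enabled: true
  when:
    - ansible_distribution in ['CentOS', 'RedHat']
    - ansible_distribution_major_version | int < 9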
28285 1727204270.59050: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service 28285 1727204270.59194: in run() - task 0affcd87-79f5-57a1-d976-00000000011f 28285 1727204270.59212: variable 'ansible_search_path' from source: unknown 28285 1727204270.59220: variable 'ansible_search_path' from source: unknown 28285 1727204270.59271: calling self._execute() 28285 1727204270.59371: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204270.59383: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204270.59396: variable 'omit' from source: magic vars 28285 1727204270.59857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204270.62442: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204270.62537: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204270.62586: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204270.62632: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204270.62667: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204270.62755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204270.62791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204270.62821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204270.62878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204270.62896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204270.63043: variable 'ansible_distribution' from source: facts 28285 1727204270.63066: variable 'ansible_distribution_major_version' from source: facts 28285 1727204270.63088: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204270.63095: when evaluation is False, skipping this task 28285 1727204270.63101: _execute() done 28285 1727204270.63106: dumping result to json 28285 1727204270.63112: done dumping result, returning 28285 1727204270.63123: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-57a1-d976-00000000011f] 28285 1727204270.63132: sending task result for task 0affcd87-79f5-57a1-d976-00000000011f 28285 1727204270.63242: done sending task result for task 0affcd87-79f5-57a1-d976-00000000011f 28285 1727204270.63251: WORKER PROCESS EXITING skipping: [managed-node1] => { 
"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28285 1727204270.63311: no more pending results, returning what we have 28285 1727204270.63315: results queue empty 28285 1727204270.63316: checking for any_errors_fatal 28285 1727204270.63326: done checking for any_errors_fatal 28285 1727204270.63326: checking for max_fail_percentage 28285 1727204270.63328: done checking for max_fail_percentage 28285 1727204270.63329: checking to see if all hosts have failed and the running result is not ok 28285 1727204270.63330: done checking to see if all hosts have failed 28285 1727204270.63331: getting the remaining hosts for this loop 28285 1727204270.63333: done getting the remaining hosts for this loop 28285 1727204270.63337: getting the next task for host managed-node1 28285 1727204270.63343: done getting next task for host managed-node1 28285 1727204270.63351: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28285 1727204270.63354: ^ state is: HOST STATE: block=3, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204270.63376: getting variables 28285 1727204270.63378: in VariableManager get_vars() 28285 1727204270.63436: Calling all_inventory to load vars for managed-node1 28285 1727204270.63439: Calling groups_inventory to load vars for managed-node1 28285 1727204270.63441: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204270.63455: Calling all_plugins_play to load vars for managed-node1 28285 1727204270.63458: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204270.63462: Calling groups_plugins_play to load vars for managed-node1 28285 1727204270.63659: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204270.63896: done with get_vars() 28285 1727204270.64022: done getting variables 28285 1727204270.64087: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:57:50 -0400 (0:00:00.065) 0:00:11.452 ***** 28285 1727204270.64237: entering _queue_task() for managed-node1/copy 28285 1727204270.64540: worker is 1 (out of 1 available) 28285 1727204270.64559: exiting _queue_task() for managed-node1/copy 28285 1727204270.64572: done queuing things up, now waiting for results queue to drain 28285 1727204270.64574: waiting for pending results... 
28285 1727204270.64847: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28285 1727204270.64994: in run() - task 0affcd87-79f5-57a1-d976-000000000120 28285 1727204270.65015: variable 'ansible_search_path' from source: unknown 28285 1727204270.65024: variable 'ansible_search_path' from source: unknown 28285 1727204270.65066: calling self._execute() 28285 1727204270.65158: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204270.65172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204270.65187: variable 'omit' from source: magic vars 28285 1727204270.65645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204270.69241: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204270.69317: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204270.69409: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204270.69499: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204270.69593: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204270.69744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204270.69818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204270.69913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204270.70034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204270.70055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204270.70439: variable 'ansible_distribution' from source: facts 28285 1727204270.70454: variable 'ansible_distribution_major_version' from source: facts 28285 1727204270.70480: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204270.70487: when evaluation is False, skipping this task 28285 1727204270.70494: _execute() done 28285 1727204270.70500: dumping result to json 28285 1727204270.70506: done dumping result, returning 28285 1727204270.70518: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-57a1-d976-000000000120] 28285 1727204270.70534: sending task result for task 0affcd87-79f5-57a1-d976-000000000120 skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204270.70719: no more pending results, returning what we have 28285 1727204270.70723: results queue empty 28285 1727204270.70724: checking for any_errors_fatal 28285 1727204270.70729: done checking for any_errors_fatal 28285 1727204270.70730: checking for max_fail_percentage 28285 1727204270.70732: done checking for max_fail_percentage 28285 1727204270.70733: checking to see if all hosts have failed and the running result is not ok 28285 1727204270.70734: done checking to see if all hosts have failed 28285 1727204270.70735: getting the remaining hosts for this loop 28285 1727204270.70737: done getting the remaining hosts for this loop 28285 1727204270.70740: getting the next task for host managed-node1 28285 1727204270.70750: done getting next task for host managed-node1 28285 1727204270.70755: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28285 1727204270.70758: ^ state is: HOST STATE: block=3, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204270.70781: getting variables 28285 1727204270.70783: in VariableManager get_vars() 28285 1727204270.70844: Calling all_inventory to load vars for managed-node1 28285 1727204270.70847: Calling groups_inventory to load vars for managed-node1 28285 1727204270.70853: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204270.70866: Calling all_plugins_play to load vars for managed-node1 28285 1727204270.70869: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204270.70872: Calling groups_plugins_play to load vars for managed-node1 28285 1727204270.71126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204270.71465: done with get_vars() 28285 1727204270.71624: done getting variables 28285 1727204270.71681: done sending task result for task 0affcd87-79f5-57a1-d976-000000000120 28285 1727204270.71685: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:57:50 -0400 (0:00:00.075) 0:00:11.527 ***** 28285 1727204270.71768: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 28285 1727204270.72303: worker is 1 (out of 1 available) 28285 1727204270.72316: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 28285 1727204270.72329: done queuing things up, now waiting for results queue to drain 28285 1727204270.72330: waiting for pending results... 
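The task queued above, Configure networking connection profiles (main.yml:159), and the Configure networking state task that follows it (main.yml:171) are not builtin modules: they are dispatched to the role's own action plugins, fedora.linux_system_roles.network_connections and fedora.linux_system_roles.network_state. The log shows only the plugin names and the shared conditional, so the parameter names in this sketch are assumptions:

    - name: Configure networking connection profiles
      fedora.linux_system_roles.network_connections:
        provider: "{{ network_provider }}"        # assumed parameter
        connections: "{{ network_connections }}"  # assumed parameter
      when: (ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)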
28285 1727204270.73173: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28285 1727204270.73397: in run() - task 0affcd87-79f5-57a1-d976-000000000121 28285 1727204270.73414: variable 'ansible_search_path' from source: unknown 28285 1727204270.73422: variable 'ansible_search_path' from source: unknown 28285 1727204270.73577: calling self._execute() 28285 1727204270.73670: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204270.73780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204270.73799: variable 'omit' from source: magic vars 28285 1727204270.74530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204270.79767: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204270.79859: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204270.79908: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204270.79947: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204270.79988: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204270.80075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204270.80114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204270.80144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204270.80196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204270.80221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204270.80379: variable 'ansible_distribution' from source: facts 28285 1727204270.80391: variable 'ansible_distribution_major_version' from source: facts 28285 1727204270.80415: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204270.80422: when evaluation is False, skipping this task 28285 1727204270.80435: _execute() done 28285 1727204270.80442: dumping result to json 28285 1727204270.80451: done dumping result, returning 28285 1727204270.80462: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-57a1-d976-000000000121] 28285 1727204270.80478: sending task result for task 0affcd87-79f5-57a1-d976-000000000121 skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204270.80636: no more pending results, returning what we have 28285 1727204270.80640: results queue empty 28285 1727204270.80641: checking for any_errors_fatal 28285 1727204270.80650: done checking for any_errors_fatal 28285 1727204270.80652: checking for max_fail_percentage 28285 1727204270.80653: done checking for max_fail_percentage 28285 1727204270.80654: checking to see if all hosts have failed and the running result is not ok 28285 1727204270.80655: done checking to see if all hosts have failed 28285 1727204270.80656: getting the remaining hosts for this loop 28285 1727204270.80658: done getting the remaining hosts for this loop 28285 1727204270.80663: getting the next task for host managed-node1 28285 1727204270.80671: done getting next task for host managed-node1 28285 1727204270.80676: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 28285 1727204270.80679: ^ state is: HOST STATE: block=3, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204270.80699: getting variables 28285 1727204270.80701: in VariableManager get_vars() 28285 1727204270.80758: Calling all_inventory to load vars for managed-node1 28285 1727204270.80762: Calling groups_inventory to load vars for managed-node1 28285 1727204270.80769: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204270.80780: Calling all_plugins_play to load vars for managed-node1 28285 1727204270.80783: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204270.80785: Calling groups_plugins_play to load vars for managed-node1 28285 1727204270.80974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204270.81211: done with get_vars() 28285 1727204270.81223: done getting variables 28285 1727204270.81276: done sending task result for task 0affcd87-79f5-57a1-d976-000000000121 28285 1727204270.81284: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:57:50 -0400 (0:00:00.097) 0:00:11.624 ***** 28285 1727204270.81484: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_state 28285 1727204270.81905: worker is 1 (out of 1 available) 28285 1727204270.81917: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_state 28285 1727204270.81929: done queuing things up, now waiting for results queue to drain 28285 1727204270.81931: waiting for pending results... 
28285 1727204270.82517: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state 28285 1727204270.82623: in run() - task 0affcd87-79f5-57a1-d976-000000000122 28285 1727204270.82635: variable 'ansible_search_path' from source: unknown 28285 1727204270.83575: variable 'ansible_search_path' from source: unknown 28285 1727204270.83618: calling self._execute() 28285 1727204270.83706: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204270.83710: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204270.83721: variable 'omit' from source: magic vars 28285 1727204270.84108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204270.89060: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204270.89124: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204270.89161: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204270.90004: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204270.90031: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204270.90111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204270.90138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204270.90164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204270.90206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204270.90219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204270.90360: variable 'ansible_distribution' from source: facts 28285 1727204270.90368: variable 'ansible_distribution_major_version' from source: facts 28285 1727204270.90387: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204270.90390: when evaluation is False, skipping this task 28285 1727204270.90393: _execute() done 28285 1727204270.90395: dumping result to json 28285 1727204270.90397: done dumping result, returning 28285 1727204270.90406: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-57a1-d976-000000000122] 28285 1727204270.90412: sending task result for task 0affcd87-79f5-57a1-d976-000000000122 28285 1727204270.90507: done sending task result for task 0affcd87-79f5-57a1-d976-000000000122 28285 1727204270.90510: WORKER PROCESS EXITING skipping: [managed-node1] => { 
"changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204270.90562: no more pending results, returning what we have 28285 1727204270.90567: results queue empty 28285 1727204270.90568: checking for any_errors_fatal 28285 1727204270.90574: done checking for any_errors_fatal 28285 1727204270.90574: checking for max_fail_percentage 28285 1727204270.90576: done checking for max_fail_percentage 28285 1727204270.90577: checking to see if all hosts have failed and the running result is not ok 28285 1727204270.90578: done checking to see if all hosts have failed 28285 1727204270.90578: getting the remaining hosts for this loop 28285 1727204270.90580: done getting the remaining hosts for this loop 28285 1727204270.90584: getting the next task for host managed-node1 28285 1727204270.90590: done getting next task for host managed-node1 28285 1727204270.90594: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28285 1727204270.90597: ^ state is: HOST STATE: block=3, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204270.90615: getting variables 28285 1727204270.90616: in VariableManager get_vars() 28285 1727204270.90672: Calling all_inventory to load vars for managed-node1 28285 1727204270.90675: Calling groups_inventory to load vars for managed-node1 28285 1727204270.90677: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204270.90686: Calling all_plugins_play to load vars for managed-node1 28285 1727204270.90689: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204270.90691: Calling groups_plugins_play to load vars for managed-node1 28285 1727204270.90934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204270.91155: done with get_vars() 28285 1727204270.91206: done getting variables 28285 1727204270.91270: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:57:50 -0400 (0:00:00.098) 0:00:11.723 ***** 28285 1727204270.91303: entering _queue_task() for managed-node1/debug 28285 1727204270.91927: worker is 1 (out of 1 available) 28285 1727204270.91940: exiting _queue_task() for managed-node1/debug 28285 1727204270.91955: done queuing things up, now waiting for results queue to drain 28285 1727204270.91957: waiting for pending results... 
28285 1727204270.93174: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28285 1727204270.93568: in run() - task 0affcd87-79f5-57a1-d976-000000000123 28285 1727204270.93954: variable 'ansible_search_path' from source: unknown 28285 1727204270.93963: variable 'ansible_search_path' from source: unknown 28285 1727204270.94102: calling self._execute() 28285 1727204270.94302: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204270.94314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204270.94328: variable 'omit' from source: magic vars 28285 1727204270.95662: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204271.00721: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204271.00810: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204271.00849: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204271.00887: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204271.00913: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204271.00988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204271.01449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204271.01485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204271.01533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204271.01554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204271.01753: variable 'ansible_distribution' from source: facts 28285 1727204271.01767: variable 'ansible_distribution_major_version' from source: facts 28285 1727204271.01791: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204271.01799: when evaluation is False, skipping this task 28285 1727204271.01806: _execute() done 28285 1727204271.01813: dumping result to json 28285 1727204271.01820: done dumping result, returning 28285 1727204271.01832: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-57a1-d976-000000000123] 28285 1727204271.01842: sending task result for task 0affcd87-79f5-57a1-d976-000000000123 skipping: [managed-node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)" } 28285 1727204271.02005: no more pending results, returning what we have 28285 1727204271.02009: results queue empty 28285 1727204271.02010: checking for any_errors_fatal 28285 1727204271.02016: done checking for any_errors_fatal 28285 1727204271.02017: checking for max_fail_percentage 28285 1727204271.02019: done checking for max_fail_percentage 28285 1727204271.02020: checking to see if all hosts have failed and the running result is not ok 28285 1727204271.02021: done checking to see if all hosts have failed 28285 1727204271.02022: getting the remaining hosts for this loop 28285 1727204271.02024: done getting the remaining hosts for this loop 28285 1727204271.02029: getting the next task for host managed-node1 28285 1727204271.02036: done getting next task for host managed-node1 28285 1727204271.02040: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28285 1727204271.02044: ^ state is: HOST STATE: block=3, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204271.02067: getting variables 28285 1727204271.02069: in VariableManager get_vars() 28285 1727204271.02126: Calling all_inventory to load vars for managed-node1 28285 1727204271.02130: Calling groups_inventory to load vars for managed-node1 28285 1727204271.02132: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204271.02143: Calling all_plugins_play to load vars for managed-node1 28285 1727204271.02146: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204271.02151: Calling groups_plugins_play to load vars for managed-node1 28285 1727204271.02332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204271.02566: done with get_vars() 28285 1727204271.02580: done getting variables 28285 1727204271.02726: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:57:51 -0400 (0:00:00.114) 0:00:11.837 ***** 28285 1727204271.02766: entering _queue_task() for managed-node1/debug 28285 1727204271.02781: done sending task result for task 0affcd87-79f5-57a1-d976-000000000123 28285 1727204271.02784: WORKER PROCESS EXITING 28285 1727204271.03190: worker is 1 (out of 1 available) 28285 1727204271.03214: exiting _queue_task() for managed-node1/debug 28285 1727204271.03229: done queuing things up, now waiting for results queue to drain 28285 1727204271.03230: waiting for pending results... 
28285 1727204271.04074: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28285 1727204271.04214: in run() - task 0affcd87-79f5-57a1-d976-000000000124 28285 1727204271.04232: variable 'ansible_search_path' from source: unknown 28285 1727204271.04236: variable 'ansible_search_path' from source: unknown 28285 1727204271.04278: calling self._execute() 28285 1727204271.04361: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204271.04367: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204271.04376: variable 'omit' from source: magic vars 28285 1727204271.04862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204271.07619: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204271.07689: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204271.07729: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204271.07770: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204271.07795: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204271.07902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204271.08031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204271.08034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204271.08181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204271.08197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204271.08345: variable 'ansible_distribution' from source: facts 28285 1727204271.08354: variable 'ansible_distribution_major_version' from source: facts 28285 1727204271.08376: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204271.08379: when evaluation is False, skipping this task 28285 1727204271.08381: _execute() done 28285 1727204271.08383: dumping result to json 28285 1727204271.08386: done dumping result, returning 28285 1727204271.08400: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-57a1-d976-000000000124] 28285 1727204271.08406: sending task result for task 0affcd87-79f5-57a1-d976-000000000124 28285 1727204271.08495: done sending task result for task 0affcd87-79f5-57a1-d976-000000000124 28285 1727204271.08499: WORKER 
PROCESS EXITING skipping: [managed-node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 28285 1727204271.08539: no more pending results, returning what we have 28285 1727204271.08544: results queue empty 28285 1727204271.08545: checking for any_errors_fatal 28285 1727204271.08551: done checking for any_errors_fatal 28285 1727204271.08552: checking for max_fail_percentage 28285 1727204271.08553: done checking for max_fail_percentage 28285 1727204271.08554: checking to see if all hosts have failed and the running result is not ok 28285 1727204271.08555: done checking to see if all hosts have failed 28285 1727204271.08556: getting the remaining hosts for this loop 28285 1727204271.08558: done getting the remaining hosts for this loop 28285 1727204271.08561: getting the next task for host managed-node1 28285 1727204271.08568: done getting next task for host managed-node1 28285 1727204271.08572: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28285 1727204271.08575: ^ state is: HOST STATE: block=3, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204271.08592: getting variables 28285 1727204271.08593: in VariableManager get_vars() 28285 1727204271.08645: Calling all_inventory to load vars for managed-node1 28285 1727204271.08648: Calling groups_inventory to load vars for managed-node1 28285 1727204271.08651: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204271.08660: Calling all_plugins_play to load vars for managed-node1 28285 1727204271.08662: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204271.08668: Calling groups_plugins_play to load vars for managed-node1 28285 1727204271.08895: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204271.09093: done with get_vars() 28285 1727204271.09103: done getting variables 28285 1727204271.09159: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:57:51 -0400 (0:00:00.065) 0:00:11.903 ***** 28285 1727204271.09303: entering _queue_task() for managed-node1/debug 28285 1727204271.09802: worker is 1 (out of 1 available) 28285 1727204271.09834: exiting _queue_task() for managed-node1/debug 28285 1727204271.09846: done queuing things up, now waiting for results queue to drain 28285 1727204271.09851: waiting for pending results... 
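The Show stderr messages and Show debug messages tasks in this stretch are plain debug tasks that would print output captured from the two action plugins; because they are guarded by the same conditional, their skipped results carry only the false_condition and no changed key. A sketch of the shape of such a task, with the registered variable name assumed:

    - name: Show stderr messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result.stderr   # assumed variable name
      when: (ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)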
28285 1727204271.10152: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28285 1727204271.10285: in run() - task 0affcd87-79f5-57a1-d976-000000000125 28285 1727204271.10302: variable 'ansible_search_path' from source: unknown 28285 1727204271.10306: variable 'ansible_search_path' from source: unknown 28285 1727204271.10342: calling self._execute() 28285 1727204271.10443: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204271.10447: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204271.10463: variable 'omit' from source: magic vars 28285 1727204271.10953: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204271.13555: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204271.13641: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204271.13684: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204271.13726: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204271.13756: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204271.13838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204271.13874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204271.13904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204271.13951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204271.13975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204271.14171: variable 'ansible_distribution' from source: facts 28285 1727204271.14185: variable 'ansible_distribution_major_version' from source: facts 28285 1727204271.14212: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204271.14219: when evaluation is False, skipping this task 28285 1727204271.14225: _execute() done 28285 1727204271.14231: dumping result to json 28285 1727204271.14237: done dumping result, returning 28285 1727204271.14248: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-57a1-d976-000000000125] 28285 1727204271.14258: sending task result for task 0affcd87-79f5-57a1-d976-000000000125 skipping: [managed-node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 
9)" } 28285 1727204271.14408: no more pending results, returning what we have 28285 1727204271.14412: results queue empty 28285 1727204271.14413: checking for any_errors_fatal 28285 1727204271.14418: done checking for any_errors_fatal 28285 1727204271.14419: checking for max_fail_percentage 28285 1727204271.14421: done checking for max_fail_percentage 28285 1727204271.14422: checking to see if all hosts have failed and the running result is not ok 28285 1727204271.14423: done checking to see if all hosts have failed 28285 1727204271.14424: getting the remaining hosts for this loop 28285 1727204271.14427: done getting the remaining hosts for this loop 28285 1727204271.14432: getting the next task for host managed-node1 28285 1727204271.14439: done getting next task for host managed-node1 28285 1727204271.14444: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 28285 1727204271.14448: ^ state is: HOST STATE: block=3, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204271.14468: getting variables 28285 1727204271.14470: in VariableManager get_vars() 28285 1727204271.14529: Calling all_inventory to load vars for managed-node1 28285 1727204271.14533: Calling groups_inventory to load vars for managed-node1 28285 1727204271.14535: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204271.14547: Calling all_plugins_play to load vars for managed-node1 28285 1727204271.14550: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204271.14553: Calling groups_plugins_play to load vars for managed-node1 28285 1727204271.14768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204271.15059: done with get_vars() 28285 1727204271.15072: done getting variables 28285 1727204271.15211: done sending task result for task 0affcd87-79f5-57a1-d976-000000000125 28285 1727204271.15214: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:57:51 -0400 (0:00:00.059) 0:00:11.962 ***** 28285 1727204271.15296: entering _queue_task() for managed-node1/ping 28285 1727204271.15767: worker is 1 (out of 1 available) 28285 1727204271.15785: exiting _queue_task() for managed-node1/ping 28285 1727204271.15798: done queuing things up, now waiting for results queue to drain 28285 1727204271.15803: waiting for pending results... 
28285 1727204271.16094: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 28285 1727204271.16243: in run() - task 0affcd87-79f5-57a1-d976-000000000126 28285 1727204271.16247: variable 'ansible_search_path' from source: unknown 28285 1727204271.16249: variable 'ansible_search_path' from source: unknown 28285 1727204271.16252: calling self._execute() 28285 1727204271.16295: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204271.16302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204271.16312: variable 'omit' from source: magic vars 28285 1727204271.17454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204271.20223: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204271.20304: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204271.20330: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204271.20369: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204271.20390: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204271.20462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204271.20487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204271.20504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204271.20530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204271.20540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204271.20650: variable 'ansible_distribution' from source: facts 28285 1727204271.20664: variable 'ansible_distribution_major_version' from source: facts 28285 1727204271.20681: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204271.20685: when evaluation is False, skipping this task 28285 1727204271.20687: _execute() done 28285 1727204271.20689: dumping result to json 28285 1727204271.20691: done dumping result, returning 28285 1727204271.20700: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-57a1-d976-000000000126] 28285 1727204271.20705: sending task result for task 0affcd87-79f5-57a1-d976-000000000126 28285 1727204271.20791: done sending task result for task 0affcd87-79f5-57a1-d976-000000000126 28285 1727204271.20794: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": 
false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204271.20836: no more pending results, returning what we have 28285 1727204271.20839: results queue empty 28285 1727204271.20840: checking for any_errors_fatal 28285 1727204271.20846: done checking for any_errors_fatal 28285 1727204271.20846: checking for max_fail_percentage 28285 1727204271.20848: done checking for max_fail_percentage 28285 1727204271.20849: checking to see if all hosts have failed and the running result is not ok 28285 1727204271.20850: done checking to see if all hosts have failed 28285 1727204271.20850: getting the remaining hosts for this loop 28285 1727204271.20852: done getting the remaining hosts for this loop 28285 1727204271.20856: getting the next task for host managed-node1 28285 1727204271.20866: done getting next task for host managed-node1 28285 1727204271.20868: ^ task is: TASK: meta (role_complete) 28285 1727204271.20871: ^ state is: HOST STATE: block=3, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204271.20894: getting variables 28285 1727204271.20900: in VariableManager get_vars() 28285 1727204271.20961: Calling all_inventory to load vars for managed-node1 28285 1727204271.20970: Calling groups_inventory to load vars for managed-node1 28285 1727204271.20973: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204271.20984: Calling all_plugins_play to load vars for managed-node1 28285 1727204271.20986: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204271.20988: Calling groups_plugins_play to load vars for managed-node1 28285 1727204271.21202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204271.21468: done with get_vars() 28285 1727204271.21479: done getting variables 28285 1727204271.21562: done queuing things up, now waiting for results queue to drain 28285 1727204271.21567: results queue empty 28285 1727204271.21568: checking for any_errors_fatal 28285 1727204271.21570: done checking for any_errors_fatal 28285 1727204271.21571: checking for max_fail_percentage 28285 1727204271.21572: done checking for max_fail_percentage 28285 1727204271.21573: checking to see if all hosts have failed and the running result is not ok 28285 1727204271.21574: done checking to see if all hosts have failed 28285 1727204271.21575: getting the remaining hosts for this loop 28285 1727204271.21575: done getting the remaining hosts for this loop 28285 1727204271.21578: getting the next task for host managed-node1 28285 1727204271.21582: done getting next task for host managed-node1 28285 1727204271.21584: ^ task is: TASK: Get current device features 28285 1727204271.21586: ^ state is: HOST STATE: block=3, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204271.21588: getting variables 28285 1727204271.21589: in VariableManager get_vars() 28285 1727204271.21609: Calling all_inventory to load vars for managed-node1 28285 1727204271.21611: Calling groups_inventory to load vars for managed-node1 28285 1727204271.21613: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204271.21622: Calling all_plugins_play to load vars for managed-node1 28285 1727204271.21625: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204271.21628: Calling groups_plugins_play to load vars for managed-node1 28285 1727204271.21771: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204271.22064: done with get_vars() 28285 1727204271.22074: done getting variables 28285 1727204271.22113: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get current device features] ********************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethtool_features.yml:191 Tuesday 24 September 2024 14:57:51 -0400 (0:00:00.068) 0:00:12.031 ***** 28285 1727204271.22156: entering _queue_task() for managed-node1/command 28285 1727204271.22416: worker is 1 (out of 1 available) 28285 1727204271.22430: exiting _queue_task() for managed-node1/command 28285 1727204271.22443: done queuing things up, now waiting for results queue to drain 28285 1727204271.22444: waiting for pending results... 
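With the role's task list exhausted, the meta (role_complete) step closes out the include and control returns to the test playbook, tests_ethtool_features.yml, whose next task (line 191) runs a raw command to read the device's current ethtool features. The command itself is not visible in this log, so the following is only a plausible sketch, with the command, interface variable, and register name assumed:

    - name: Get current device features
      ansible.builtin.command: ethtool -k {{ interface }}   # assumed command and variable
      register: current_features                            # assumed register name
      when: (ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)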
28285 1727204271.22613: running TaskExecutor() for managed-node1/TASK: Get current device features 28285 1727204271.22670: in run() - task 0affcd87-79f5-57a1-d976-000000000156 28285 1727204271.22683: variable 'ansible_search_path' from source: unknown 28285 1727204271.22712: calling self._execute() 28285 1727204271.22785: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204271.22789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204271.22795: variable 'omit' from source: magic vars 28285 1727204271.23112: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204271.24790: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204271.24833: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204271.24868: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204271.24894: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204271.24914: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204271.24976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204271.24995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204271.25015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204271.25041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204271.25055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204271.25159: variable 'ansible_distribution' from source: facts 28285 1727204271.25162: variable 'ansible_distribution_major_version' from source: facts 28285 1727204271.25182: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204271.25185: when evaluation is False, skipping this task 28285 1727204271.25188: _execute() done 28285 1727204271.25190: dumping result to json 28285 1727204271.25192: done dumping result, returning 28285 1727204271.25199: done running TaskExecutor() for managed-node1/TASK: Get current device features [0affcd87-79f5-57a1-d976-000000000156] 28285 1727204271.25205: sending task result for task 0affcd87-79f5-57a1-d976-000000000156 28285 1727204271.25292: done sending task result for task 0affcd87-79f5-57a1-d976-000000000156 28285 1727204271.25295: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", 
"skip_reason": "Conditional result was False" } 28285 1727204271.25346: no more pending results, returning what we have 28285 1727204271.25349: results queue empty 28285 1727204271.25350: checking for any_errors_fatal 28285 1727204271.25352: done checking for any_errors_fatal 28285 1727204271.25353: checking for max_fail_percentage 28285 1727204271.25354: done checking for max_fail_percentage 28285 1727204271.25355: checking to see if all hosts have failed and the running result is not ok 28285 1727204271.25356: done checking to see if all hosts have failed 28285 1727204271.25357: getting the remaining hosts for this loop 28285 1727204271.25358: done getting the remaining hosts for this loop 28285 1727204271.25362: getting the next task for host managed-node1 28285 1727204271.25370: done getting next task for host managed-node1 28285 1727204271.25372: ^ task is: TASK: ASSERT: The profile does not change the ethtool features 28285 1727204271.25374: ^ state is: HOST STATE: block=3, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28285 1727204271.25377: getting variables 28285 1727204271.25379: in VariableManager get_vars() 28285 1727204271.25435: Calling all_inventory to load vars for managed-node1 28285 1727204271.25442: Calling groups_inventory to load vars for managed-node1 28285 1727204271.25445: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204271.25454: Calling all_plugins_play to load vars for managed-node1 28285 1727204271.25456: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204271.25459: Calling groups_plugins_play to load vars for managed-node1 28285 1727204271.25589: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204271.25743: done with get_vars() 28285 1727204271.25751: done getting variables 28285 1727204271.25794: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [ASSERT: The profile does not change the ethtool features] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethtool_features.yml:196 Tuesday 24 September 2024 14:57:51 -0400 (0:00:00.036) 0:00:12.068 ***** 28285 1727204271.25815: entering _queue_task() for managed-node1/assert 28285 1727204271.26014: worker is 1 (out of 1 available) 28285 1727204271.26028: exiting _queue_task() for managed-node1/assert 28285 1727204271.26040: done queuing things up, now waiting for results queue to drain 28285 1727204271.26041: waiting for pending results... 
28285 1727204271.26214: running TaskExecutor() for managed-node1/TASK: ASSERT: The profile does not change the ethtool features 28285 1727204271.26269: in run() - task 0affcd87-79f5-57a1-d976-000000000157 28285 1727204271.26286: variable 'ansible_search_path' from source: unknown 28285 1727204271.26314: calling self._execute() 28285 1727204271.26394: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204271.26398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204271.26407: variable 'omit' from source: magic vars 28285 1727204271.26724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204271.28328: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204271.28385: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204271.28413: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204271.28441: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204271.28463: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204271.28520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204271.28541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204271.28562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204271.28590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204271.28600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204271.28702: variable 'ansible_distribution' from source: facts 28285 1727204271.28708: variable 'ansible_distribution_major_version' from source: facts 28285 1727204271.28724: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204271.28727: when evaluation is False, skipping this task 28285 1727204271.28730: _execute() done 28285 1727204271.28732: dumping result to json 28285 1727204271.28734: done dumping result, returning 28285 1727204271.28741: done running TaskExecutor() for managed-node1/TASK: ASSERT: The profile does not change the ethtool features [0affcd87-79f5-57a1-d976-000000000157] 28285 1727204271.28747: sending task result for task 0affcd87-79f5-57a1-d976-000000000157 28285 1727204271.28836: done sending task result for task 0affcd87-79f5-57a1-d976-000000000157 28285 1727204271.28838: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204271.28907: no more pending results, returning what we have 28285 1727204271.28910: results queue empty 28285 1727204271.28911: checking for any_errors_fatal 28285 1727204271.28916: done checking for any_errors_fatal 28285 1727204271.28916: checking for max_fail_percentage 28285 1727204271.28918: done checking for max_fail_percentage 28285 1727204271.28919: checking to see if all hosts have failed and the running result is not ok 28285 1727204271.28920: done checking to see if all hosts have failed 28285 1727204271.28921: getting the remaining hosts for this loop 28285 1727204271.28922: done getting the remaining hosts for this loop 28285 1727204271.28926: getting the next task for host managed-node1 28285 1727204271.28936: done getting next task for host managed-node1 28285 1727204271.28941: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28285 1727204271.28945: ^ state is: HOST STATE: block=3, task=24, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 28285 1727204271.28974: getting variables 28285 1727204271.28976: in VariableManager get_vars() 28285 1727204271.29022: Calling all_inventory to load vars for managed-node1 28285 1727204271.29025: Calling groups_inventory to load vars for managed-node1 28285 1727204271.29027: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204271.29034: Calling all_plugins_play to load vars for managed-node1 28285 1727204271.29035: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204271.29037: Calling groups_plugins_play to load vars for managed-node1 28285 1727204271.29155: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204271.29286: done with get_vars() 28285 1727204271.29294: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:57:51 -0400 (0:00:00.035) 0:00:12.103 ***** 28285 1727204271.29361: entering _queue_task() for managed-node1/include_tasks 28285 1727204271.29563: worker is 1 (out of 1 available) 28285 1727204271.29577: exiting _queue_task() for managed-node1/include_tasks 28285 1727204271.29589: done queuing things up, now waiting for results queue to drain 28285 1727204271.29591: waiting for pending results... 
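From this point on the queued tasks come from roles/network/tasks/main.yml rather than from the test playbook, yet each one is skipped with the same false_condition as the test tasks above. That is normal conditional inheritance: a when attached to the block or include that pulls in the role is re-evaluated on every task delivered by it. The exact wrapper in the test playbook is not shown in this log, so the following is only an illustration of the pattern:

    - name: Run the network role only on CentOS/RHEL older than 9
      include_role:
        name: fedora.linux_system_roles.network
      when:
        - ansible_distribution in ['CentOS', 'RedHat']
        - ansible_distribution_major_version | int < 9

Every role task below, starting with the "Ensure ansible_facts used by role" include_tasks queued here, inherits that condition and is skipped on this host, where it evaluates to False.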
28285 1727204271.29754: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28285 1727204271.29848: in run() - task 0affcd87-79f5-57a1-d976-000000000160 28285 1727204271.29862: variable 'ansible_search_path' from source: unknown 28285 1727204271.29867: variable 'ansible_search_path' from source: unknown 28285 1727204271.29897: calling self._execute() 28285 1727204271.29966: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204271.29971: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204271.29979: variable 'omit' from source: magic vars 28285 1727204271.30294: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204271.31911: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204271.31954: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204271.31985: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204271.32013: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204271.32034: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204271.32090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204271.32113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204271.32131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204271.32158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204271.32170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204271.32270: variable 'ansible_distribution' from source: facts 28285 1727204271.32276: variable 'ansible_distribution_major_version' from source: facts 28285 1727204271.32291: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204271.32294: when evaluation is False, skipping this task 28285 1727204271.32296: _execute() done 28285 1727204271.32299: dumping result to json 28285 1727204271.32301: done dumping result, returning 28285 1727204271.32309: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-57a1-d976-000000000160] 28285 1727204271.32315: sending task result for task 0affcd87-79f5-57a1-d976-000000000160 28285 1727204271.32404: done sending task result for task 0affcd87-79f5-57a1-d976-000000000160 28285 1727204271.32407: WORKER PROCESS EXITING skipping: 
[managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204271.32474: no more pending results, returning what we have 28285 1727204271.32477: results queue empty 28285 1727204271.32478: checking for any_errors_fatal 28285 1727204271.32483: done checking for any_errors_fatal 28285 1727204271.32484: checking for max_fail_percentage 28285 1727204271.32485: done checking for max_fail_percentage 28285 1727204271.32486: checking to see if all hosts have failed and the running result is not ok 28285 1727204271.32487: done checking to see if all hosts have failed 28285 1727204271.32487: getting the remaining hosts for this loop 28285 1727204271.32489: done getting the remaining hosts for this loop 28285 1727204271.32494: getting the next task for host managed-node1 28285 1727204271.32500: done getting next task for host managed-node1 28285 1727204271.32504: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 28285 1727204271.32509: ^ state is: HOST STATE: block=3, task=24, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28285 1727204271.32527: getting variables 28285 1727204271.32528: in VariableManager get_vars() 28285 1727204271.32584: Calling all_inventory to load vars for managed-node1 28285 1727204271.32587: Calling groups_inventory to load vars for managed-node1 28285 1727204271.32589: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204271.32596: Calling all_plugins_play to load vars for managed-node1 28285 1727204271.32598: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204271.32600: Calling groups_plugins_play to load vars for managed-node1 28285 1727204271.32751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204271.32880: done with get_vars() 28285 1727204271.32888: done getting variables 28285 1727204271.32927: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:57:51 -0400 (0:00:00.035) 0:00:12.139 ***** 28285 1727204271.32952: entering _queue_task() for managed-node1/debug 28285 1727204271.33152: worker is 1 (out of 1 available) 28285 1727204271.33167: exiting _queue_task() for managed-node1/debug 28285 1727204271.33180: done queuing things up, now waiting for results queue to drain 28285 1727204271.33181: waiting for pending results... 28285 1727204271.33347: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider 28285 1727204271.33437: in run() - task 0affcd87-79f5-57a1-d976-000000000161 28285 1727204271.33447: variable 'ansible_search_path' from source: unknown 28285 1727204271.33451: variable 'ansible_search_path' from source: unknown 28285 1727204271.33483: calling self._execute() 28285 1727204271.33548: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204271.33555: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204271.33563: variable 'omit' from source: magic vars 28285 1727204271.33875: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204271.35505: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204271.35558: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204271.35592: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204271.35617: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204271.35637: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204271.35701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204271.35721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204271.35738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204271.35768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204271.35779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204271.35880: variable 'ansible_distribution' from source: facts 28285 1727204271.35886: variable 'ansible_distribution_major_version' from source: facts 28285 1727204271.35902: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204271.35905: when evaluation is False, skipping this task 28285 1727204271.35908: _execute() done 28285 1727204271.35911: dumping result to json 28285 1727204271.35914: done dumping result, returning 28285 1727204271.35918: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-57a1-d976-000000000161] 28285 1727204271.35931: sending task result for task 0affcd87-79f5-57a1-d976-000000000161 28285 1727204271.36017: done sending task result for task 0affcd87-79f5-57a1-d976-000000000161 28285 1727204271.36020: WORKER PROCESS EXITING skipping: [managed-node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 28285 1727204271.36076: no more pending results, returning what we have 28285 1727204271.36079: results queue empty 28285 1727204271.36080: checking for any_errors_fatal 28285 1727204271.36085: done checking for any_errors_fatal 28285 1727204271.36086: checking for max_fail_percentage 28285 1727204271.36087: done checking for max_fail_percentage 28285 1727204271.36089: checking to see if all hosts have failed and the running result is not ok 28285 1727204271.36089: done checking to see if all hosts have failed 28285 1727204271.36090: getting the remaining hosts for this loop 28285 1727204271.36091: done getting the remaining hosts for this loop 28285 1727204271.36095: getting the next task for host managed-node1 28285 1727204271.36102: done getting next task for host managed-node1 28285 1727204271.36106: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28285 1727204271.36110: ^ state is: HOST STATE: block=3, task=24, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 28285 1727204271.36130: getting variables 28285 1727204271.36136: in VariableManager get_vars() 28285 1727204271.36189: Calling all_inventory to load vars for managed-node1 28285 1727204271.36192: Calling groups_inventory to load vars for managed-node1 28285 1727204271.36194: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204271.36203: Calling all_plugins_play to load vars for managed-node1 28285 1727204271.36205: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204271.36207: Calling groups_plugins_play to load vars for managed-node1 28285 1727204271.36326: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204271.36456: done with get_vars() 28285 1727204271.36467: done getting variables 28285 1727204271.36509: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:57:51 -0400 (0:00:00.035) 0:00:12.175 ***** 28285 1727204271.36532: entering _queue_task() for managed-node1/fail 28285 1727204271.36737: worker is 1 (out of 1 available) 28285 1727204271.36750: exiting _queue_task() for managed-node1/fail 28285 1727204271.36765: done queuing things up, now waiting for results queue to drain 28285 1727204271.36767: waiting for pending results... 
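The banner above names a guard implemented with the fail module. Its real condition lives at roles/network/tasks/main.yml:11 and is not visible in this trace; as a hedged illustration of how such a guard is typically expressed (network_state is named in the task title, but treating it and network_provider as the variables checked here is an assumption):

    - name: Abort when network_state is combined with the initscripts provider
      fail:
        msg: Applying a network state description requires the NetworkManager provider
      when:
        - network_state is defined            # assumed variable name
        - network_provider == "initscripts"   # assumed variable name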
28285 1727204271.36934: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28285 1727204271.37019: in run() - task 0affcd87-79f5-57a1-d976-000000000162 28285 1727204271.37031: variable 'ansible_search_path' from source: unknown 28285 1727204271.37035: variable 'ansible_search_path' from source: unknown 28285 1727204271.37065: calling self._execute() 28285 1727204271.37129: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204271.37132: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204271.37141: variable 'omit' from source: magic vars 28285 1727204271.37458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204271.39130: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204271.39181: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204271.39215: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204271.39243: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204271.39267: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204271.39332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204271.39354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204271.39373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204271.39402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204271.39417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204271.39524: variable 'ansible_distribution' from source: facts 28285 1727204271.39530: variable 'ansible_distribution_major_version' from source: facts 28285 1727204271.39545: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204271.39548: when evaluation is False, skipping this task 28285 1727204271.39553: _execute() done 28285 1727204271.39556: dumping result to json 28285 1727204271.39559: done dumping result, returning 28285 1727204271.39572: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-57a1-d976-000000000162] 28285 1727204271.39580: sending task result for task 
0affcd87-79f5-57a1-d976-000000000162 28285 1727204271.39673: done sending task result for task 0affcd87-79f5-57a1-d976-000000000162 28285 1727204271.39676: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204271.39744: no more pending results, returning what we have 28285 1727204271.39748: results queue empty 28285 1727204271.39748: checking for any_errors_fatal 28285 1727204271.39756: done checking for any_errors_fatal 28285 1727204271.39756: checking for max_fail_percentage 28285 1727204271.39758: done checking for max_fail_percentage 28285 1727204271.39759: checking to see if all hosts have failed and the running result is not ok 28285 1727204271.39760: done checking to see if all hosts have failed 28285 1727204271.39761: getting the remaining hosts for this loop 28285 1727204271.39762: done getting the remaining hosts for this loop 28285 1727204271.39768: getting the next task for host managed-node1 28285 1727204271.39774: done getting next task for host managed-node1 28285 1727204271.39778: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28285 1727204271.39782: ^ state is: HOST STATE: block=3, task=24, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28285 1727204271.39807: getting variables 28285 1727204271.39809: in VariableManager get_vars() 28285 1727204271.39854: Calling all_inventory to load vars for managed-node1 28285 1727204271.39857: Calling groups_inventory to load vars for managed-node1 28285 1727204271.39859: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204271.39868: Calling all_plugins_play to load vars for managed-node1 28285 1727204271.39871: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204271.39873: Calling groups_plugins_play to load vars for managed-node1 28285 1727204271.40033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204271.40157: done with get_vars() 28285 1727204271.40167: done getting variables 28285 1727204271.40207: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:57:51 -0400 (0:00:00.036) 0:00:12.212 ***** 28285 1727204271.40232: entering _queue_task() for managed-node1/fail 28285 1727204271.40437: worker is 1 (out of 1 available) 28285 1727204271.40449: exiting _queue_task() for managed-node1/fail 28285 1727204271.40461: done queuing things up, now waiting for results queue to drain 28285 1727204271.40465: waiting for pending results... 
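Both this guard and the conditional reported throughout the trace compare ansible_distribution_major_version with an | int cast. The cast matters because the fact is gathered as a string, so an uncast comparison would be lexicographic rather than numeric. A self-contained illustration, with a made-up value rather than anything from this run:

    - name: Show why the int cast matters for version comparisons
      vars:
        fake_major_version: "10"   # facts such as ansible_distribution_major_version are strings
      debug:
        msg:
          - "string compare:  {{ fake_major_version < '9' }}"       # True, '1' sorts before '9'
          - "numeric compare: {{ fake_major_version | int < 9 }}"   # False, 10 is not below 9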
28285 1727204271.40636: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28285 1727204271.40727: in run() - task 0affcd87-79f5-57a1-d976-000000000163 28285 1727204271.40736: variable 'ansible_search_path' from source: unknown 28285 1727204271.40740: variable 'ansible_search_path' from source: unknown 28285 1727204271.40770: calling self._execute() 28285 1727204271.40841: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204271.40845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204271.40856: variable 'omit' from source: magic vars 28285 1727204271.41170: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204271.42799: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204271.42856: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204271.42885: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204271.42912: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204271.42931: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204271.42992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204271.43012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204271.43029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204271.43063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204271.43076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204271.43176: variable 'ansible_distribution' from source: facts 28285 1727204271.43183: variable 'ansible_distribution_major_version' from source: facts 28285 1727204271.43198: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204271.43201: when evaluation is False, skipping this task 28285 1727204271.43203: _execute() done 28285 1727204271.43206: dumping result to json 28285 1727204271.43208: done dumping result, returning 28285 1727204271.43216: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-57a1-d976-000000000163] 28285 1727204271.43221: sending task result for task 0affcd87-79f5-57a1-d976-000000000163 28285 1727204271.43311: 
done sending task result for task 0affcd87-79f5-57a1-d976-000000000163 28285 1727204271.43313: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204271.43359: no more pending results, returning what we have 28285 1727204271.43363: results queue empty 28285 1727204271.43365: checking for any_errors_fatal 28285 1727204271.43372: done checking for any_errors_fatal 28285 1727204271.43373: checking for max_fail_percentage 28285 1727204271.43374: done checking for max_fail_percentage 28285 1727204271.43375: checking to see if all hosts have failed and the running result is not ok 28285 1727204271.43376: done checking to see if all hosts have failed 28285 1727204271.43377: getting the remaining hosts for this loop 28285 1727204271.43378: done getting the remaining hosts for this loop 28285 1727204271.43382: getting the next task for host managed-node1 28285 1727204271.43389: done getting next task for host managed-node1 28285 1727204271.43393: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28285 1727204271.43397: ^ state is: HOST STATE: block=3, task=24, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28285 1727204271.43417: getting variables 28285 1727204271.43418: in VariableManager get_vars() 28285 1727204271.43476: Calling all_inventory to load vars for managed-node1 28285 1727204271.43479: Calling groups_inventory to load vars for managed-node1 28285 1727204271.43480: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204271.43489: Calling all_plugins_play to load vars for managed-node1 28285 1727204271.43491: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204271.43494: Calling groups_plugins_play to load vars for managed-node1 28285 1727204271.43617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204271.43743: done with get_vars() 28285 1727204271.43754: done getting variables 28285 1727204271.43797: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:57:51 -0400 (0:00:00.035) 0:00:12.248 ***** 28285 1727204271.43821: entering _queue_task() for managed-node1/fail 28285 1727204271.44028: worker is 1 (out of 1 available) 28285 1727204271.44041: exiting _queue_task() for managed-node1/fail 28285 1727204271.44052: done queuing things up, now waiting for results queue to drain 28285 1727204271.44054: waiting for pending results... 
28285 1727204271.44228: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28285 1727204271.44320: in run() - task 0affcd87-79f5-57a1-d976-000000000164 28285 1727204271.44331: variable 'ansible_search_path' from source: unknown 28285 1727204271.44334: variable 'ansible_search_path' from source: unknown 28285 1727204271.44366: calling self._execute() 28285 1727204271.44432: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204271.44437: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204271.44445: variable 'omit' from source: magic vars 28285 1727204271.44756: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204271.46483: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204271.46529: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204271.46560: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204271.46590: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204271.46610: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204271.46669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204271.46744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204271.46775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204271.46856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204271.46878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204271.47048: variable 'ansible_distribution' from source: facts 28285 1727204271.47061: variable 'ansible_distribution_major_version' from source: facts 28285 1727204271.47086: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204271.47094: when evaluation is False, skipping this task 28285 1727204271.47100: _execute() done 28285 1727204271.47107: dumping result to json 28285 1727204271.47115: done dumping result, returning 28285 1727204271.47127: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-57a1-d976-000000000164] 28285 1727204271.47139: sending task result for task 0affcd87-79f5-57a1-d976-000000000164 skipping: [managed-node1] => { 
"changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204271.47316: no more pending results, returning what we have 28285 1727204271.47319: results queue empty 28285 1727204271.47320: checking for any_errors_fatal 28285 1727204271.47327: done checking for any_errors_fatal 28285 1727204271.47328: checking for max_fail_percentage 28285 1727204271.47330: done checking for max_fail_percentage 28285 1727204271.47331: checking to see if all hosts have failed and the running result is not ok 28285 1727204271.47331: done checking to see if all hosts have failed 28285 1727204271.47332: getting the remaining hosts for this loop 28285 1727204271.47334: done getting the remaining hosts for this loop 28285 1727204271.47338: getting the next task for host managed-node1 28285 1727204271.47346: done getting next task for host managed-node1 28285 1727204271.47350: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28285 1727204271.47354: ^ state is: HOST STATE: block=3, task=24, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28285 1727204271.47383: getting variables 28285 1727204271.47385: in VariableManager get_vars() 28285 1727204271.47441: Calling all_inventory to load vars for managed-node1 28285 1727204271.47444: Calling groups_inventory to load vars for managed-node1 28285 1727204271.47446: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204271.47458: Calling all_plugins_play to load vars for managed-node1 28285 1727204271.47460: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204271.47465: Calling groups_plugins_play to load vars for managed-node1 28285 1727204271.47774: done sending task result for task 0affcd87-79f5-57a1-d976-000000000164 28285 1727204271.47778: WORKER PROCESS EXITING 28285 1727204271.47843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204271.48146: done with get_vars() 28285 1727204271.48159: done getting variables 28285 1727204271.48216: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:57:51 -0400 (0:00:00.044) 0:00:12.292 ***** 28285 1727204271.48262: entering _queue_task() for managed-node1/dnf 28285 1727204271.48470: worker is 1 (out of 1 available) 28285 1727204271.48483: exiting _queue_task() for managed-node1/dnf 28285 1727204271.48496: done queuing things up, now waiting for results queue to drain 28285 1727204271.48498: waiting for pending results... 
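The long task name above describes an availability check rather than an installation: a package-manager task run in check mode can report whether updates exist without changing the system. The sketch below shows that general pattern only; the package list variable, the register name, and the use of check_mode are assumptions, not the role's actual implementation:

    - name: Check whether updates for the network packages are available (DNF)
      dnf:
        name: "{{ network_packages }}"   # assumed variable holding the relevant package names
        state: latest
      check_mode: true                   # only report whether anything would change
      register: __network_dnf_check      # assumed register name

A changed result from such a check is the kind of signal that the "Ask user's consent to restart NetworkManager" task queued further below can act on.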
28285 1727204271.48663: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28285 1727204271.48753: in run() - task 0affcd87-79f5-57a1-d976-000000000165 28285 1727204271.48762: variable 'ansible_search_path' from source: unknown 28285 1727204271.48768: variable 'ansible_search_path' from source: unknown 28285 1727204271.48798: calling self._execute() 28285 1727204271.48868: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204271.48872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204271.48880: variable 'omit' from source: magic vars 28285 1727204271.49191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204271.51359: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204271.51442: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204271.51494: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204271.51533: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204271.51576: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204271.51657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204271.51698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204271.51727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204271.51778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204271.51803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204271.51960: variable 'ansible_distribution' from source: facts 28285 1727204271.51975: variable 'ansible_distribution_major_version' from source: facts 28285 1727204271.52000: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204271.52012: when evaluation is False, skipping this task 28285 1727204271.52018: _execute() done 28285 1727204271.52024: dumping result to json 28285 1727204271.52033: done dumping result, returning 28285 1727204271.52047: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-57a1-d976-000000000165] 28285 1727204271.52063: sending task result for task 
0affcd87-79f5-57a1-d976-000000000165 skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204271.52236: no more pending results, returning what we have 28285 1727204271.52239: results queue empty 28285 1727204271.52240: checking for any_errors_fatal 28285 1727204271.52246: done checking for any_errors_fatal 28285 1727204271.52247: checking for max_fail_percentage 28285 1727204271.52252: done checking for max_fail_percentage 28285 1727204271.52253: checking to see if all hosts have failed and the running result is not ok 28285 1727204271.52253: done checking to see if all hosts have failed 28285 1727204271.52254: getting the remaining hosts for this loop 28285 1727204271.52256: done getting the remaining hosts for this loop 28285 1727204271.52260: getting the next task for host managed-node1 28285 1727204271.52270: done getting next task for host managed-node1 28285 1727204271.52274: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28285 1727204271.52279: ^ state is: HOST STATE: block=3, task=24, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28285 1727204271.52303: getting variables 28285 1727204271.52305: in VariableManager get_vars() 28285 1727204271.52362: Calling all_inventory to load vars for managed-node1 28285 1727204271.52367: Calling groups_inventory to load vars for managed-node1 28285 1727204271.52370: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204271.52382: Calling all_plugins_play to load vars for managed-node1 28285 1727204271.52385: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204271.52388: Calling groups_plugins_play to load vars for managed-node1 28285 1727204271.52577: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204271.52811: done with get_vars() 28285 1727204271.52824: done getting variables 28285 1727204271.53004: done sending task result for task 0affcd87-79f5-57a1-d976-000000000165 28285 1727204271.53007: WORKER PROCESS EXITING redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28285 1727204271.53033: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:57:51 -0400 (0:00:00.048) 0:00:12.340 ***** 28285 1727204271.53076: entering _queue_task() for managed-node1/yum 28285 1727204271.53546: worker is 1 (out of 1 available) 28285 1727204271.53560: exiting _queue_task() for managed-node1/yum 28285 1727204271.53575: done queuing things up, now waiting for results queue to drain 28285 1727204271.53576: waiting for pending results... 
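The "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" line just above shows that on this controller the yum module is an alias: ansible-core routes it to the dnf action plugin, so a task written against yum still runs through dnf on dnf-based targets. The YUM variant of the update check therefore mirrors the DNF one; the sketch carries the same caveats as before (names are assumptions):

    - name: Check whether updates for the network packages are available (YUM)
      yum:                               # resolved to the dnf action plugin, as the log shows
        name: "{{ network_packages }}"   # assumed variable
        state: latest
      check_mode: true
      register: __network_yum_check      # assumed register name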
28285 1727204271.53856: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28285 1727204271.54015: in run() - task 0affcd87-79f5-57a1-d976-000000000166 28285 1727204271.54037: variable 'ansible_search_path' from source: unknown 28285 1727204271.54044: variable 'ansible_search_path' from source: unknown 28285 1727204271.54093: calling self._execute() 28285 1727204271.54193: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204271.54205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204271.54218: variable 'omit' from source: magic vars 28285 1727204271.54665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204271.57378: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204271.57462: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204271.57510: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204271.57556: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204271.57591: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204271.57687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204271.57725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204271.57760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204271.57813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204271.57837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204271.58003: variable 'ansible_distribution' from source: facts 28285 1727204271.58016: variable 'ansible_distribution_major_version' from source: facts 28285 1727204271.58046: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204271.58057: when evaluation is False, skipping this task 28285 1727204271.58066: _execute() done 28285 1727204271.58074: dumping result to json 28285 1727204271.58082: done dumping result, returning 28285 1727204271.58098: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-57a1-d976-000000000166] 28285 1727204271.58109: sending task result for task 
0affcd87-79f5-57a1-d976-000000000166 skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204271.58280: no more pending results, returning what we have 28285 1727204271.58284: results queue empty 28285 1727204271.58285: checking for any_errors_fatal 28285 1727204271.58291: done checking for any_errors_fatal 28285 1727204271.58292: checking for max_fail_percentage 28285 1727204271.58294: done checking for max_fail_percentage 28285 1727204271.58295: checking to see if all hosts have failed and the running result is not ok 28285 1727204271.58295: done checking to see if all hosts have failed 28285 1727204271.58296: getting the remaining hosts for this loop 28285 1727204271.58298: done getting the remaining hosts for this loop 28285 1727204271.58302: getting the next task for host managed-node1 28285 1727204271.58310: done getting next task for host managed-node1 28285 1727204271.58314: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28285 1727204271.58319: ^ state is: HOST STATE: block=3, task=24, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28285 1727204271.58342: getting variables 28285 1727204271.58344: in VariableManager get_vars() 28285 1727204271.58406: Calling all_inventory to load vars for managed-node1 28285 1727204271.58409: Calling groups_inventory to load vars for managed-node1 28285 1727204271.58412: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204271.58423: Calling all_plugins_play to load vars for managed-node1 28285 1727204271.58427: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204271.58430: Calling groups_plugins_play to load vars for managed-node1 28285 1727204271.58660: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204271.59000: done with get_vars() 28285 1727204271.59012: done getting variables 28285 1727204271.59044: done sending task result for task 0affcd87-79f5-57a1-d976-000000000166 28285 1727204271.59047: WORKER PROCESS EXITING 28285 1727204271.59086: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:57:51 -0400 (0:00:00.061) 0:00:12.402 ***** 28285 1727204271.59236: entering _queue_task() for managed-node1/fail 28285 1727204271.59603: worker is 1 (out of 1 available) 28285 1727204271.59615: exiting _queue_task() for managed-node1/fail 28285 1727204271.59630: done queuing things up, now waiting for results queue to drain 28285 1727204271.59632: waiting for pending results... 
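
The queue entry above loads the fail action plugin for the consent task at tasks/main.yml:60, so on hosts where the condition holds the play would stop unless the operator had opted in. A minimal sketch under that assumption; the message text and the network_allow_restart variable are illustrative and not taken from the role:

- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: >-
      Managing wireless or team interfaces requires restarting NetworkManager.
      Set the consent variable to true to allow this.
  when:
    - ansible_distribution in ['CentOS', 'RedHat']
    - ansible_distribution_major_version | int < 9
    - not (network_allow_restart | default(false))   # illustrative consent flag

On this run the distribution check is already False, so the fail task is skipped before any consent flag would matter.
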
28285 1727204271.59924: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28285 1727204271.60087: in run() - task 0affcd87-79f5-57a1-d976-000000000167 28285 1727204271.60107: variable 'ansible_search_path' from source: unknown 28285 1727204271.60115: variable 'ansible_search_path' from source: unknown 28285 1727204271.60158: calling self._execute() 28285 1727204271.60256: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204271.60270: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204271.60289: variable 'omit' from source: magic vars 28285 1727204271.60778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204271.63391: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204271.63784: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204271.63827: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204271.63873: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204271.63908: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204271.63998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204271.64032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204271.64067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204271.64120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204271.64139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204271.64296: variable 'ansible_distribution' from source: facts 28285 1727204271.64313: variable 'ansible_distribution_major_version' from source: facts 28285 1727204271.64335: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204271.64342: when evaluation is False, skipping this task 28285 1727204271.64350: _execute() done 28285 1727204271.64357: dumping result to json 28285 1727204271.64365: done dumping result, returning 28285 1727204271.64377: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-57a1-d976-000000000167] 28285 1727204271.64386: sending task result for task 0affcd87-79f5-57a1-d976-000000000167 28285 1727204271.64506: done sending task result for task 
0affcd87-79f5-57a1-d976-000000000167 28285 1727204271.64514: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204271.64569: no more pending results, returning what we have 28285 1727204271.64573: results queue empty 28285 1727204271.64574: checking for any_errors_fatal 28285 1727204271.64581: done checking for any_errors_fatal 28285 1727204271.64582: checking for max_fail_percentage 28285 1727204271.64584: done checking for max_fail_percentage 28285 1727204271.64585: checking to see if all hosts have failed and the running result is not ok 28285 1727204271.64586: done checking to see if all hosts have failed 28285 1727204271.64586: getting the remaining hosts for this loop 28285 1727204271.64588: done getting the remaining hosts for this loop 28285 1727204271.64592: getting the next task for host managed-node1 28285 1727204271.64599: done getting next task for host managed-node1 28285 1727204271.64604: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 28285 1727204271.64608: ^ state is: HOST STATE: block=3, task=24, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28285 1727204271.64630: getting variables 28285 1727204271.64632: in VariableManager get_vars() 28285 1727204271.64687: Calling all_inventory to load vars for managed-node1 28285 1727204271.64690: Calling groups_inventory to load vars for managed-node1 28285 1727204271.64693: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204271.64706: Calling all_plugins_play to load vars for managed-node1 28285 1727204271.64709: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204271.64711: Calling groups_plugins_play to load vars for managed-node1 28285 1727204271.64895: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204271.65124: done with get_vars() 28285 1727204271.65136: done getting variables 28285 1727204271.65323: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:57:51 -0400 (0:00:00.061) 0:00:12.463 ***** 28285 1727204271.65362: entering _queue_task() for managed-node1/package 28285 1727204271.65847: worker is 1 (out of 1 available) 28285 1727204271.65863: exiting _queue_task() for managed-node1/package 28285 1727204271.65878: done queuing things up, now waiting for results queue to drain 28285 1727204271.65880: waiting for pending results... 28285 1727204271.66189: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages 28285 1727204271.66347: in run() - task 0affcd87-79f5-57a1-d976-000000000168 28285 1727204271.66376: variable 'ansible_search_path' from source: unknown 28285 1727204271.66388: variable 'ansible_search_path' from source: unknown 28285 1727204271.66437: calling self._execute() 28285 1727204271.66534: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204271.66554: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204271.66571: variable 'omit' from source: magic vars 28285 1727204271.67067: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204271.70107: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204271.70193: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204271.70242: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204271.70287: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204271.70321: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204271.70416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204271.70455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204271.70489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204271.70540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204271.70562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204271.70724: variable 'ansible_distribution' from source: facts 28285 1727204271.70740: variable 'ansible_distribution_major_version' from source: facts 28285 1727204271.70768: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204271.70776: when evaluation is False, skipping this task 28285 1727204271.70782: _execute() done 28285 1727204271.70791: dumping result to json 28285 1727204271.70798: done dumping result, returning 28285 1727204271.70809: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-57a1-d976-000000000168] 28285 1727204271.70819: sending task result for task 0affcd87-79f5-57a1-d976-000000000168 skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204271.70985: no more pending results, returning what we have 28285 1727204271.70989: results queue empty 28285 1727204271.70990: checking for any_errors_fatal 28285 1727204271.70996: done checking for any_errors_fatal 28285 1727204271.70997: checking for max_fail_percentage 28285 1727204271.70998: done checking for max_fail_percentage 28285 1727204271.70999: checking to see if all hosts have failed and the running result is not ok 28285 1727204271.71000: done checking to see if all hosts have failed 28285 1727204271.71001: getting the remaining hosts for this loop 28285 1727204271.71003: done getting the remaining hosts for this loop 28285 1727204271.71007: getting the next task for host managed-node1 28285 1727204271.71014: done getting next task for host managed-node1 28285 1727204271.71019: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28285 1727204271.71023: ^ state is: HOST STATE: block=3, task=24, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False 28285 1727204271.71046: getting variables 28285 1727204271.71047: in VariableManager get_vars() 28285 1727204271.71106: Calling all_inventory to load vars for managed-node1 28285 1727204271.71109: Calling groups_inventory to load vars for managed-node1 28285 1727204271.71111: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204271.71122: Calling all_plugins_play to load vars for managed-node1 28285 1727204271.71125: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204271.71128: Calling groups_plugins_play to load vars for managed-node1 28285 1727204271.71361: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204271.71684: done with get_vars() 28285 1727204271.71701: done getting variables 28285 1727204271.71807: done sending task result for task 0affcd87-79f5-57a1-d976-000000000168 28285 1727204271.71811: WORKER PROCESS EXITING 28285 1727204271.71852: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:57:51 -0400 (0:00:00.065) 0:00:12.528 ***** 28285 1727204271.71893: entering _queue_task() for managed-node1/package 28285 1727204271.72347: worker is 1 (out of 1 available) 28285 1727204271.72366: exiting _queue_task() for managed-node1/package 28285 1727204271.72380: done queuing things up, now waiting for results queue to drain 28285 1727204271.72381: waiting for pending results... 
28285 1727204271.72667: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28285 1727204271.72822: in run() - task 0affcd87-79f5-57a1-d976-000000000169 28285 1727204271.72850: variable 'ansible_search_path' from source: unknown 28285 1727204271.72860: variable 'ansible_search_path' from source: unknown 28285 1727204271.72910: calling self._execute() 28285 1727204271.73020: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204271.73032: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204271.73055: variable 'omit' from source: magic vars 28285 1727204271.73559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204271.76545: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204271.76629: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204271.76676: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204271.76737: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204271.76775: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204271.76868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204271.76902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204271.76938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204271.76993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204271.77013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204271.77180: variable 'ansible_distribution' from source: facts 28285 1727204271.77194: variable 'ansible_distribution_major_version' from source: facts 28285 1727204271.77216: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204271.77223: when evaluation is False, skipping this task 28285 1727204271.77231: _execute() done 28285 1727204271.77241: dumping result to json 28285 1727204271.77250: done dumping result, returning 28285 1727204271.77262: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-57a1-d976-000000000169] 28285 1727204271.77276: sending task result for task 0affcd87-79f5-57a1-d976-000000000169 skipping: [managed-node1] => { "changed": false, "false_condition": 
"(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204271.77436: no more pending results, returning what we have 28285 1727204271.77439: results queue empty 28285 1727204271.77440: checking for any_errors_fatal 28285 1727204271.77447: done checking for any_errors_fatal 28285 1727204271.77451: checking for max_fail_percentage 28285 1727204271.77452: done checking for max_fail_percentage 28285 1727204271.77454: checking to see if all hosts have failed and the running result is not ok 28285 1727204271.77454: done checking to see if all hosts have failed 28285 1727204271.77455: getting the remaining hosts for this loop 28285 1727204271.77457: done getting the remaining hosts for this loop 28285 1727204271.77461: getting the next task for host managed-node1 28285 1727204271.77471: done getting next task for host managed-node1 28285 1727204271.77475: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28285 1727204271.77479: ^ state is: HOST STATE: block=3, task=24, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28285 1727204271.77502: getting variables 28285 1727204271.77504: in VariableManager get_vars() 28285 1727204271.77559: Calling all_inventory to load vars for managed-node1 28285 1727204271.77562: Calling groups_inventory to load vars for managed-node1 28285 1727204271.77566: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204271.77578: Calling all_plugins_play to load vars for managed-node1 28285 1727204271.77580: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204271.77583: Calling groups_plugins_play to load vars for managed-node1 28285 1727204271.77765: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204271.77997: done with get_vars() 28285 1727204271.78009: done getting variables 28285 1727204271.78076: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 28285 1727204271.78308: done sending task result for task 0affcd87-79f5-57a1-d976-000000000169 28285 1727204271.78312: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:57:51 -0400 (0:00:00.064) 0:00:12.593 ***** 28285 1727204271.78327: entering _queue_task() for managed-node1/package 28285 1727204271.78722: worker is 1 (out of 1 available) 28285 1727204271.78738: exiting _queue_task() for managed-node1/package 28285 1727204271.78754: done queuing things up, now waiting for results queue to drain 28285 1727204271.78755: waiting for pending results... 
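
Every skip reported in this part of the run carries the same false_condition, evaluated against the ansible_distribution facts gathered earlier in the play. A minimal standalone reproduction of that evaluation, reusing the managed-node1 host name from this inventory (the play and debug message are otherwise illustrative):

- hosts: managed-node1
  gather_facts: true
  tasks:
    - name: Runs only on CentOS/RHEL older than 9
      ansible.builtin.debug:
        msg: "EL {{ ansible_distribution_major_version }} needs the legacy handling"
      when: >-
        ansible_distribution in ['CentOS', 'RedHat'] and
        ansible_distribution_major_version | int < 9

On the nodes in this run the expression evaluates to False, so each guarded task is reported as skipping with skip_reason "Conditional result was False" instead of executing a module.
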
28285 1727204271.79037: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28285 1727204271.79202: in run() - task 0affcd87-79f5-57a1-d976-00000000016a 28285 1727204271.79223: variable 'ansible_search_path' from source: unknown 28285 1727204271.79234: variable 'ansible_search_path' from source: unknown 28285 1727204271.79283: calling self._execute() 28285 1727204271.79381: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204271.79399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204271.79419: variable 'omit' from source: magic vars 28285 1727204271.79883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204271.82897: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204271.82978: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204271.83028: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204271.83073: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204271.83114: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204271.83201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204271.83243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204271.83283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204271.83337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204271.83361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204271.83507: variable 'ansible_distribution' from source: facts 28285 1727204271.83518: variable 'ansible_distribution_major_version' from source: facts 28285 1727204271.83546: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204271.83556: when evaluation is False, skipping this task 28285 1727204271.83561: _execute() done 28285 1727204271.83568: dumping result to json 28285 1727204271.83574: done dumping result, returning 28285 1727204271.83584: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-57a1-d976-00000000016a] 28285 1727204271.83593: sending task result for task 0affcd87-79f5-57a1-d976-00000000016a skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in 
['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204271.83744: no more pending results, returning what we have 28285 1727204271.83747: results queue empty 28285 1727204271.83750: checking for any_errors_fatal 28285 1727204271.83757: done checking for any_errors_fatal 28285 1727204271.83758: checking for max_fail_percentage 28285 1727204271.83760: done checking for max_fail_percentage 28285 1727204271.83761: checking to see if all hosts have failed and the running result is not ok 28285 1727204271.83762: done checking to see if all hosts have failed 28285 1727204271.83763: getting the remaining hosts for this loop 28285 1727204271.83766: done getting the remaining hosts for this loop 28285 1727204271.83770: getting the next task for host managed-node1 28285 1727204271.83779: done getting next task for host managed-node1 28285 1727204271.83784: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28285 1727204271.83788: ^ state is: HOST STATE: block=3, task=24, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28285 1727204271.83809: getting variables 28285 1727204271.83811: in VariableManager get_vars() 28285 1727204271.83867: Calling all_inventory to load vars for managed-node1 28285 1727204271.83870: Calling groups_inventory to load vars for managed-node1 28285 1727204271.83872: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204271.83883: Calling all_plugins_play to load vars for managed-node1 28285 1727204271.83886: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204271.83889: Calling groups_plugins_play to load vars for managed-node1 28285 1727204271.84121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204271.84333: done with get_vars() 28285 1727204271.84345: done getting variables 28285 1727204271.84507: done sending task result for task 0affcd87-79f5-57a1-d976-00000000016a 28285 1727204271.84511: WORKER PROCESS EXITING 28285 1727204271.84539: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:57:51 -0400 (0:00:00.062) 0:00:12.655 ***** 28285 1727204271.84585: entering _queue_task() for managed-node1/service 28285 1727204271.85090: worker is 1 (out of 1 available) 28285 1727204271.85102: exiting _queue_task() for managed-node1/service 28285 1727204271.85114: done queuing things up, now waiting for results queue to drain 28285 1727204271.85116: waiting for pending results... 
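
The task queued above ("Restart NetworkManager due to wireless or team interfaces", main.yml:109) loads the service action plugin. Only the task name, path, and conditional are visible here; a sketch of the likely shape follows, where everything except the when: expression and the NetworkManager name implied by the task title is an assumption:

- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  when:
    - ansible_distribution in ['CentOS', 'RedHat']
    - ansible_distribution_major_version | int < 9
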
28285 1727204271.85406: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28285 1727204271.85565: in run() - task 0affcd87-79f5-57a1-d976-00000000016b 28285 1727204271.85586: variable 'ansible_search_path' from source: unknown 28285 1727204271.85598: variable 'ansible_search_path' from source: unknown 28285 1727204271.85638: calling self._execute() 28285 1727204271.85737: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204271.85752: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204271.85772: variable 'omit' from source: magic vars 28285 1727204271.86262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204271.89219: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204271.89306: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204271.89373: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204271.89417: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204271.89454: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204271.89542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204271.89585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204271.89622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204271.89676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204271.89698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204271.89867: variable 'ansible_distribution' from source: facts 28285 1727204271.89879: variable 'ansible_distribution_major_version' from source: facts 28285 1727204271.89906: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204271.89914: when evaluation is False, skipping this task 28285 1727204271.89922: _execute() done 28285 1727204271.89931: dumping result to json 28285 1727204271.89938: done dumping result, returning 28285 1727204271.89952: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-57a1-d976-00000000016b] 28285 1727204271.89962: sending task result for task 0affcd87-79f5-57a1-d976-00000000016b skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in 
['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204271.90121: no more pending results, returning what we have 28285 1727204271.90125: results queue empty 28285 1727204271.90126: checking for any_errors_fatal 28285 1727204271.90131: done checking for any_errors_fatal 28285 1727204271.90132: checking for max_fail_percentage 28285 1727204271.90134: done checking for max_fail_percentage 28285 1727204271.90135: checking to see if all hosts have failed and the running result is not ok 28285 1727204271.90136: done checking to see if all hosts have failed 28285 1727204271.90137: getting the remaining hosts for this loop 28285 1727204271.90138: done getting the remaining hosts for this loop 28285 1727204271.90143: getting the next task for host managed-node1 28285 1727204271.90153: done getting next task for host managed-node1 28285 1727204271.90157: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28285 1727204271.90163: ^ state is: HOST STATE: block=3, task=24, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28285 1727204271.90185: getting variables 28285 1727204271.90187: in VariableManager get_vars() 28285 1727204271.90241: Calling all_inventory to load vars for managed-node1 28285 1727204271.90244: Calling groups_inventory to load vars for managed-node1 28285 1727204271.90247: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204271.90261: Calling all_plugins_play to load vars for managed-node1 28285 1727204271.90264: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204271.90268: Calling groups_plugins_play to load vars for managed-node1 28285 1727204271.90453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204271.90686: done with get_vars() 28285 1727204271.90698: done getting variables 28285 1727204271.90762: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 28285 1727204271.90970: done sending task result for task 0affcd87-79f5-57a1-d976-00000000016b 28285 1727204271.90973: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:57:51 -0400 (0:00:00.064) 0:00:12.719 ***** 28285 1727204271.90989: entering _queue_task() for managed-node1/service 28285 1727204271.91410: worker is 1 (out of 1 available) 28285 1727204271.91423: exiting _queue_task() for managed-node1/service 28285 1727204271.91440: done queuing things up, now waiting for results queue to drain 28285 1727204271.91441: waiting for pending results... 
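
The result that follows for "Enable and start NetworkManager" (main.yml:122) is censored because the task sets no_log: true, so the usual JSON result is replaced by the "output has been hidden" placeholder. A sketch of an enable-and-start task of that shape, assuming only what the task name and the censored result imply:

- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: NetworkManager
    state: started
    enabled: true
  no_log: true    # matches the "output has been hidden due to ... 'no_log: true'" message in the result below
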
28285 1727204271.91727: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28285 1727204271.91878: in run() - task 0affcd87-79f5-57a1-d976-00000000016c 28285 1727204271.91899: variable 'ansible_search_path' from source: unknown 28285 1727204271.91906: variable 'ansible_search_path' from source: unknown 28285 1727204271.91942: calling self._execute() 28285 1727204271.92037: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204271.92051: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204271.92068: variable 'omit' from source: magic vars 28285 1727204271.92538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204271.95108: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204271.95161: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204271.95192: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204271.95221: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204271.95246: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204271.95306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204271.95327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204271.95353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204271.95378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204271.95388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204271.95493: variable 'ansible_distribution' from source: facts 28285 1727204271.95499: variable 'ansible_distribution_major_version' from source: facts 28285 1727204271.95514: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204271.95517: when evaluation is False, skipping this task 28285 1727204271.95520: _execute() done 28285 1727204271.95522: dumping result to json 28285 1727204271.95524: done dumping result, returning 28285 1727204271.95531: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-57a1-d976-00000000016c] 28285 1727204271.95539: sending task result for task 0affcd87-79f5-57a1-d976-00000000016c 28285 1727204271.95631: done sending task result for task 0affcd87-79f5-57a1-d976-00000000016c 28285 1727204271.95634: WORKER PROCESS EXITING skipping: 
[managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28285 1727204271.95716: no more pending results, returning what we have 28285 1727204271.95720: results queue empty 28285 1727204271.95721: checking for any_errors_fatal 28285 1727204271.95727: done checking for any_errors_fatal 28285 1727204271.95727: checking for max_fail_percentage 28285 1727204271.95729: done checking for max_fail_percentage 28285 1727204271.95730: checking to see if all hosts have failed and the running result is not ok 28285 1727204271.95731: done checking to see if all hosts have failed 28285 1727204271.95731: getting the remaining hosts for this loop 28285 1727204271.95733: done getting the remaining hosts for this loop 28285 1727204271.95737: getting the next task for host managed-node1 28285 1727204271.95743: done getting next task for host managed-node1 28285 1727204271.95747: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28285 1727204271.95755: ^ state is: HOST STATE: block=3, task=24, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28285 1727204271.95776: getting variables 28285 1727204271.95778: in VariableManager get_vars() 28285 1727204271.95824: Calling all_inventory to load vars for managed-node1 28285 1727204271.95827: Calling groups_inventory to load vars for managed-node1 28285 1727204271.95829: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204271.95837: Calling all_plugins_play to load vars for managed-node1 28285 1727204271.95838: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204271.95840: Calling groups_plugins_play to load vars for managed-node1 28285 1727204271.96180: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204271.96359: done with get_vars() 28285 1727204271.96371: done getting variables 28285 1727204271.96431: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:57:51 -0400 (0:00:00.054) 0:00:12.774 ***** 28285 1727204271.96469: entering _queue_task() for managed-node1/service 28285 1727204271.96753: worker is 1 (out of 1 available) 28285 1727204271.96765: exiting _queue_task() for managed-node1/service 28285 1727204271.96777: done queuing things up, now waiting for results queue to drain 28285 1727204271.96779: waiting for pending results... 28285 1727204271.97058: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28285 1727204271.97224: in run() - task 0affcd87-79f5-57a1-d976-00000000016d 28285 1727204271.97245: variable 'ansible_search_path' from source: unknown 28285 1727204271.97257: variable 'ansible_search_path' from source: unknown 28285 1727204271.97300: calling self._execute() 28285 1727204271.97392: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204271.97404: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204271.97418: variable 'omit' from source: magic vars 28285 1727204271.97814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204271.99497: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204271.99567: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204271.99622: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204271.99665: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204271.99696: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204271.99786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204271.99821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204271.99860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204271.99936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204271.99961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204272.00116: variable 'ansible_distribution' from source: facts 28285 1727204272.00128: variable 'ansible_distribution_major_version' from source: facts 28285 1727204272.00154: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204272.00167: when evaluation is False, skipping this task 28285 1727204272.00175: _execute() done 28285 1727204272.00182: dumping result to json 28285 1727204272.00189: done dumping result, returning 28285 1727204272.00200: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-57a1-d976-00000000016d] 28285 1727204272.00211: sending task result for task 0affcd87-79f5-57a1-d976-00000000016d skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204272.00365: no more pending results, returning what we have 28285 1727204272.00370: results queue empty 28285 1727204272.00371: checking for any_errors_fatal 28285 1727204272.00379: done checking for any_errors_fatal 28285 1727204272.00379: checking for max_fail_percentage 28285 1727204272.00381: done checking for max_fail_percentage 28285 1727204272.00382: checking to see if all hosts have failed and the running result is not ok 28285 1727204272.00383: done checking to see if all hosts have failed 28285 1727204272.00384: getting the remaining hosts for this loop 28285 1727204272.00386: done getting the remaining hosts for this loop 28285 1727204272.00390: getting the next task for host managed-node1 28285 1727204272.00399: done getting next task for host managed-node1 28285 1727204272.00404: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 28285 1727204272.00409: ^ state is: HOST STATE: block=3, task=24, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? 
False, did start at task? False 28285 1727204272.00432: getting variables 28285 1727204272.00434: in VariableManager get_vars() 28285 1727204272.00496: Calling all_inventory to load vars for managed-node1 28285 1727204272.00499: Calling groups_inventory to load vars for managed-node1 28285 1727204272.00502: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204272.00514: Calling all_plugins_play to load vars for managed-node1 28285 1727204272.00516: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204272.00519: Calling groups_plugins_play to load vars for managed-node1 28285 1727204272.00728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204272.00947: done with get_vars() 28285 1727204272.00963: done getting variables 28285 1727204272.01300: done sending task result for task 0affcd87-79f5-57a1-d976-00000000016d 28285 1727204272.01303: WORKER PROCESS EXITING 28285 1727204272.01338: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:57:52 -0400 (0:00:00.049) 0:00:12.823 ***** 28285 1727204272.01378: entering _queue_task() for managed-node1/service 28285 1727204272.01636: worker is 1 (out of 1 available) 28285 1727204272.01651: exiting _queue_task() for managed-node1/service 28285 1727204272.01665: done queuing things up, now waiting for results queue to drain 28285 1727204272.01667: waiting for pending results... 
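Annotation for readers following the trace: the two service tasks in this stretch of the log ("Enable and start wpa_supplicant", just skipped above, and "Enable network service", just queued) both report the same false_condition. A minimal sketch of what such a guarded task could look like, assuming a plain ansible.builtin.service call; the service name and task body are illustrative guesses, not the role's actual source:

# Illustrative sketch only -- not the role's real tasks/main.yml.
# The when: clause mirrors the false_condition printed in the skip results above.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant   # assumed service name, taken from the task title
    enabled: true
    state: started
  when:
    - ansible_distribution in ['CentOS', 'RedHat']
    - ansible_distribution_major_version | int < 9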
28285 1727204272.02432: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service 28285 1727204272.02845: in run() - task 0affcd87-79f5-57a1-d976-00000000016e 28285 1727204272.02871: variable 'ansible_search_path' from source: unknown 28285 1727204272.02879: variable 'ansible_search_path' from source: unknown 28285 1727204272.02922: calling self._execute() 28285 1727204272.03015: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204272.03026: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204272.03040: variable 'omit' from source: magic vars 28285 1727204272.04283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204272.08106: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204272.08188: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204272.08232: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204272.08276: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204272.08308: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204272.08395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204272.08431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204272.08470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204272.08518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204272.08538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204272.08693: variable 'ansible_distribution' from source: facts 28285 1727204272.08706: variable 'ansible_distribution_major_version' from source: facts 28285 1727204272.08729: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204272.08737: when evaluation is False, skipping this task 28285 1727204272.08744: _execute() done 28285 1727204272.08754: dumping result to json 28285 1727204272.08762: done dumping result, returning 28285 1727204272.08778: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-57a1-d976-00000000016e] 28285 1727204272.08789: sending task result for task 0affcd87-79f5-57a1-d976-00000000016e skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28285 
1727204272.08934: no more pending results, returning what we have 28285 1727204272.08937: results queue empty 28285 1727204272.08938: checking for any_errors_fatal 28285 1727204272.08942: done checking for any_errors_fatal 28285 1727204272.08943: checking for max_fail_percentage 28285 1727204272.08945: done checking for max_fail_percentage 28285 1727204272.08946: checking to see if all hosts have failed and the running result is not ok 28285 1727204272.08947: done checking to see if all hosts have failed 28285 1727204272.08947: getting the remaining hosts for this loop 28285 1727204272.08949: done getting the remaining hosts for this loop 28285 1727204272.08953: getting the next task for host managed-node1 28285 1727204272.08959: done getting next task for host managed-node1 28285 1727204272.08965: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28285 1727204272.08970: ^ state is: HOST STATE: block=3, task=24, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28285 1727204272.08991: getting variables 28285 1727204272.08993: in VariableManager get_vars() 28285 1727204272.09043: Calling all_inventory to load vars for managed-node1 28285 1727204272.09045: Calling groups_inventory to load vars for managed-node1 28285 1727204272.09047: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204272.09057: Calling all_plugins_play to load vars for managed-node1 28285 1727204272.09059: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204272.09062: Calling groups_plugins_play to load vars for managed-node1 28285 1727204272.09221: done sending task result for task 0affcd87-79f5-57a1-d976-00000000016e 28285 1727204272.09224: WORKER PROCESS EXITING 28285 1727204272.09319: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204272.09545: done with get_vars() 28285 1727204272.09557: done getting variables 28285 1727204272.09626: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:57:52 -0400 (0:00:00.082) 0:00:12.906 ***** 28285 1727204272.09662: entering _queue_task() for managed-node1/copy 28285 1727204272.09954: worker is 1 (out of 1 available) 28285 1727204272.09968: exiting _queue_task() for managed-node1/copy 28285 1727204272.09980: done queuing things up, now waiting for results queue to drain 28285 1727204272.09982: waiting for pending results... 
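The "Enable network service" result a few lines above is printed as censored because that task sets no_log. A hedged sketch of a task that would produce that kind of censored skip output; only the no_log/when combination is the point here, and the module arguments are placeholders:

# Illustrative only: with no_log: true, Ansible replaces the task's result
# (even a skipped one, as in the log above) with the "censored" placeholder.
- name: Enable network service
  ansible.builtin.service:
    name: network   # placeholder service name for this sketch
    enabled: true
  no_log: true
  when:
    - ansible_distribution in ['CentOS', 'RedHat']
    - ansible_distribution_major_version | int < 9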
28285 1727204272.10282: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28285 1727204272.10440: in run() - task 0affcd87-79f5-57a1-d976-00000000016f 28285 1727204272.10459: variable 'ansible_search_path' from source: unknown 28285 1727204272.10470: variable 'ansible_search_path' from source: unknown 28285 1727204272.10520: calling self._execute() 28285 1727204272.10623: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204272.10634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204272.10652: variable 'omit' from source: magic vars 28285 1727204272.11131: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204272.13691: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204272.13791: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204272.13838: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204272.13889: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204272.13923: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204272.14017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204272.14050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204272.14091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204272.14143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204272.14165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204272.14347: variable 'ansible_distribution' from source: facts 28285 1727204272.14361: variable 'ansible_distribution_major_version' from source: facts 28285 1727204272.14393: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204272.14407: when evaluation is False, skipping this task 28285 1727204272.14415: _execute() done 28285 1727204272.14422: dumping result to json 28285 1727204272.14432: done dumping result, returning 28285 1727204272.14449: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-57a1-d976-00000000016f] 28285 1727204272.14461: sending task result for task 0affcd87-79f5-57a1-d976-00000000016f skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204272.14628: no more pending results, returning what we have 28285 1727204272.14632: results queue empty 28285 1727204272.14632: checking for any_errors_fatal 28285 1727204272.14639: done checking for any_errors_fatal 28285 1727204272.14640: checking for max_fail_percentage 28285 1727204272.14642: done checking for max_fail_percentage 28285 1727204272.14643: checking to see if all hosts have failed and the running result is not ok 28285 1727204272.14644: done checking to see if all hosts have failed 28285 1727204272.14645: getting the remaining hosts for this loop 28285 1727204272.14652: done getting the remaining hosts for this loop 28285 1727204272.14656: getting the next task for host managed-node1 28285 1727204272.14665: done getting next task for host managed-node1 28285 1727204272.14669: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28285 1727204272.14674: ^ state is: HOST STATE: block=3, task=24, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 28285 1727204272.14695: getting variables 28285 1727204272.14697: in VariableManager get_vars() 28285 1727204272.14755: Calling all_inventory to load vars for managed-node1 28285 1727204272.14758: Calling groups_inventory to load vars for managed-node1 28285 1727204272.14761: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204272.14775: Calling all_plugins_play to load vars for managed-node1 28285 1727204272.14778: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204272.14781: Calling groups_plugins_play to load vars for managed-node1 28285 1727204272.14984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204272.15237: done with get_vars() 28285 1727204272.15251: done getting variables 28285 1727204272.15419: done sending task result for task 0affcd87-79f5-57a1-d976-00000000016f 28285 1727204272.15423: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:57:52 -0400 (0:00:00.058) 0:00:12.964 ***** 28285 1727204272.15491: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 28285 1727204272.16033: worker is 1 (out of 1 available) 28285 1727204272.16046: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 28285 1727204272.16057: done queuing things up, now waiting for results queue to drain 28285 1727204272.16059: waiting for pending results... 
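Every task in this block skips for the same reason: the gathered facts for managed-node1 do not satisfy the CentOS/RedHat major-version-below-9 condition. A quick, hypothetical troubleshooting task (not part of the role or the test play) that prints the facts behind the false_condition and the expression's own value:

# Hypothetical check: evaluate the same expression the role keeps skipping on,
# using the distribution facts referenced in the log.
- name: Show why the initscripts-era tasks are skipped
  ansible.builtin.debug:
    msg: >-
      {{ ansible_distribution }} {{ ansible_distribution_major_version }} =>
      {{ ansible_distribution in ['CentOS', 'RedHat']
         and ansible_distribution_major_version | int < 9 }}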
28285 1727204272.16343: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28285 1727204272.16482: in run() - task 0affcd87-79f5-57a1-d976-000000000170 28285 1727204272.16503: variable 'ansible_search_path' from source: unknown 28285 1727204272.16518: variable 'ansible_search_path' from source: unknown 28285 1727204272.16561: calling self._execute() 28285 1727204272.16666: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204272.16678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204272.16692: variable 'omit' from source: magic vars 28285 1727204272.17145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204272.19656: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204272.19750: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204272.19813: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204272.19858: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204272.19907: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204272.19998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204272.20045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204272.20082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204272.20136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204272.20155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204272.20311: variable 'ansible_distribution' from source: facts 28285 1727204272.20323: variable 'ansible_distribution_major_version' from source: facts 28285 1727204272.20356: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204272.20366: when evaluation is False, skipping this task 28285 1727204272.20373: _execute() done 28285 1727204272.20380: dumping result to json 28285 1727204272.20387: done dumping result, returning 28285 1727204272.20398: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-57a1-d976-000000000170] 28285 1727204272.20409: sending task result for task 0affcd87-79f5-57a1-d976-000000000170 28285 1727204272.20537: done sending task result for task 0affcd87-79f5-57a1-d976-000000000170 28285 1727204272.20548: WORKER PROCESS EXITING 
skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204272.20605: no more pending results, returning what we have 28285 1727204272.20608: results queue empty 28285 1727204272.20609: checking for any_errors_fatal 28285 1727204272.20617: done checking for any_errors_fatal 28285 1727204272.20617: checking for max_fail_percentage 28285 1727204272.20619: done checking for max_fail_percentage 28285 1727204272.20620: checking to see if all hosts have failed and the running result is not ok 28285 1727204272.20621: done checking to see if all hosts have failed 28285 1727204272.20622: getting the remaining hosts for this loop 28285 1727204272.20624: done getting the remaining hosts for this loop 28285 1727204272.20628: getting the next task for host managed-node1 28285 1727204272.20635: done getting next task for host managed-node1 28285 1727204272.20639: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 28285 1727204272.20644: ^ state is: HOST STATE: block=3, task=24, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 28285 1727204272.20666: getting variables 28285 1727204272.20670: in VariableManager get_vars() 28285 1727204272.20725: Calling all_inventory to load vars for managed-node1 28285 1727204272.20728: Calling groups_inventory to load vars for managed-node1 28285 1727204272.20731: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204272.20742: Calling all_plugins_play to load vars for managed-node1 28285 1727204272.20744: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204272.20747: Calling groups_plugins_play to load vars for managed-node1 28285 1727204272.20995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204272.21332: done with get_vars() 28285 1727204272.21345: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:57:52 -0400 (0:00:00.060) 0:00:13.025 ***** 28285 1727204272.21559: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_state 28285 1727204272.21925: worker is 1 (out of 1 available) 28285 1727204272.21938: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_state 28285 1727204272.21950: done queuing things up, now waiting for results queue to drain 28285 1727204272.21951: waiting for pending results... 
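All of the task paths in these banners point into roles/network/tasks/main.yml under the fedora.linux_system_roles collection, i.e. this whole block is a single role applied to managed-node1. A hedged sketch of the kind of play that pulls the role in; the variable value is a placeholder and the real test play defines its own:

# Illustrative play shape only -- not the collection's actual test playbook.
- hosts: managed-node1
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections: []   # placeholder; the real play supplies its own profiles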
28285 1727204272.22236: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state 28285 1727204272.22379: in run() - task 0affcd87-79f5-57a1-d976-000000000171 28285 1727204272.22400: variable 'ansible_search_path' from source: unknown 28285 1727204272.22413: variable 'ansible_search_path' from source: unknown 28285 1727204272.22453: calling self._execute() 28285 1727204272.22550: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204272.22561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204272.22577: variable 'omit' from source: magic vars 28285 1727204272.23053: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204272.25687: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204272.25779: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204272.25828: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204272.25872: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204272.25914: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204272.26002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204272.26041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204272.26073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204272.26128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204272.26147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204272.26317: variable 'ansible_distribution' from source: facts 28285 1727204272.26337: variable 'ansible_distribution_major_version' from source: facts 28285 1727204272.26362: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204272.26372: when evaluation is False, skipping this task 28285 1727204272.26379: _execute() done 28285 1727204272.26386: dumping result to json 28285 1727204272.26392: done dumping result, returning 28285 1727204272.26404: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-57a1-d976-000000000171] 28285 1727204272.26414: sending task result for task 0affcd87-79f5-57a1-d976-000000000171 skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", 
"skip_reason": "Conditional result was False" } 28285 1727204272.26580: no more pending results, returning what we have 28285 1727204272.26584: results queue empty 28285 1727204272.26585: checking for any_errors_fatal 28285 1727204272.26592: done checking for any_errors_fatal 28285 1727204272.26593: checking for max_fail_percentage 28285 1727204272.26595: done checking for max_fail_percentage 28285 1727204272.26596: checking to see if all hosts have failed and the running result is not ok 28285 1727204272.26597: done checking to see if all hosts have failed 28285 1727204272.26598: getting the remaining hosts for this loop 28285 1727204272.26600: done getting the remaining hosts for this loop 28285 1727204272.26604: getting the next task for host managed-node1 28285 1727204272.26611: done getting next task for host managed-node1 28285 1727204272.26616: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28285 1727204272.26621: ^ state is: HOST STATE: block=3, task=24, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28285 1727204272.26642: getting variables 28285 1727204272.26644: in VariableManager get_vars() 28285 1727204272.26706: Calling all_inventory to load vars for managed-node1 28285 1727204272.26709: Calling groups_inventory to load vars for managed-node1 28285 1727204272.26712: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204272.26723: Calling all_plugins_play to load vars for managed-node1 28285 1727204272.26726: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204272.26729: Calling groups_plugins_play to load vars for managed-node1 28285 1727204272.26923: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204272.27149: done with get_vars() 28285 1727204272.27162: done getting variables 28285 1727204272.27329: done sending task result for task 0affcd87-79f5-57a1-d976-000000000171 28285 1727204272.27333: WORKER PROCESS EXITING 28285 1727204272.27354: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:57:52 -0400 (0:00:00.058) 0:00:13.083 ***** 28285 1727204272.27398: entering _queue_task() for managed-node1/debug 28285 1727204272.27866: worker is 1 (out of 1 available) 28285 1727204272.27881: exiting _queue_task() for managed-node1/debug 28285 1727204272.27891: done queuing things up, now waiting for results queue to drain 28285 1727204272.27892: waiting for pending results... 
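The role continues with a pair of display tasks ("Show stderr messages..." here and "Show debug messages for the network_connections" next); note in the output that follows that when a debug task skips, the result carries only false_condition, without the changed/skip_reason keys the other modules print. A sketch of what such a task might look like, where the registered variable name is a guess for illustration only:

# Hypothetical sketch; __network_connections_result is an assumed register name.
- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines
  when: ansible_distribution in ['CentOS', 'RedHat'] and ansible_distribution_major_version | int < 9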
28285 1727204272.28168: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28285 1727204272.28315: in run() - task 0affcd87-79f5-57a1-d976-000000000172 28285 1727204272.28337: variable 'ansible_search_path' from source: unknown 28285 1727204272.28343: variable 'ansible_search_path' from source: unknown 28285 1727204272.28383: calling self._execute() 28285 1727204272.28481: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204272.28491: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204272.28503: variable 'omit' from source: magic vars 28285 1727204272.28966: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204272.31672: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204272.31724: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204272.31754: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204272.31782: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204272.31806: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204272.31868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204272.31889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204272.31911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204272.31937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204272.31948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204272.32065: variable 'ansible_distribution' from source: facts 28285 1727204272.32071: variable 'ansible_distribution_major_version' from source: facts 28285 1727204272.32086: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204272.32089: when evaluation is False, skipping this task 28285 1727204272.32092: _execute() done 28285 1727204272.32095: dumping result to json 28285 1727204272.32097: done dumping result, returning 28285 1727204272.32105: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-57a1-d976-000000000172] 28285 1727204272.32111: sending task result for task 0affcd87-79f5-57a1-d976-000000000172 28285 1727204272.32198: done sending task result for task 0affcd87-79f5-57a1-d976-000000000172 28285 1727204272.32200: WORKER 
PROCESS EXITING skipping: [managed-node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 28285 1727204272.32278: no more pending results, returning what we have 28285 1727204272.32282: results queue empty 28285 1727204272.32283: checking for any_errors_fatal 28285 1727204272.32287: done checking for any_errors_fatal 28285 1727204272.32288: checking for max_fail_percentage 28285 1727204272.32290: done checking for max_fail_percentage 28285 1727204272.32291: checking to see if all hosts have failed and the running result is not ok 28285 1727204272.32292: done checking to see if all hosts have failed 28285 1727204272.32293: getting the remaining hosts for this loop 28285 1727204272.32294: done getting the remaining hosts for this loop 28285 1727204272.32298: getting the next task for host managed-node1 28285 1727204272.32304: done getting next task for host managed-node1 28285 1727204272.32308: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28285 1727204272.32312: ^ state is: HOST STATE: block=3, task=24, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28285 1727204272.32334: getting variables 28285 1727204272.32336: in VariableManager get_vars() 28285 1727204272.32385: Calling all_inventory to load vars for managed-node1 28285 1727204272.32388: Calling groups_inventory to load vars for managed-node1 28285 1727204272.32390: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204272.32399: Calling all_plugins_play to load vars for managed-node1 28285 1727204272.32400: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204272.32403: Calling groups_plugins_play to load vars for managed-node1 28285 1727204272.32521: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204272.32689: done with get_vars() 28285 1727204272.32697: done getting variables 28285 1727204272.32739: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:57:52 -0400 (0:00:00.053) 0:00:13.137 ***** 28285 1727204272.32767: entering _queue_task() for managed-node1/debug 28285 1727204272.32981: worker is 1 (out of 1 available) 28285 1727204272.32997: exiting _queue_task() for managed-node1/debug 28285 1727204272.33029: done queuing things up, now waiting for results queue to drain 28285 1727204272.33032: waiting for pending results... 28285 1727204272.33314: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28285 1727204272.33485: in run() - task 0affcd87-79f5-57a1-d976-000000000173 28285 1727204272.33514: variable 'ansible_search_path' from source: unknown 28285 1727204272.33521: variable 'ansible_search_path' from source: unknown 28285 1727204272.33569: calling self._execute() 28285 1727204272.33673: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204272.33684: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204272.33697: variable 'omit' from source: magic vars 28285 1727204272.34222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204272.36621: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204272.36681: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204272.36710: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204272.36736: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204272.36759: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204272.36819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204272.36840: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204272.36862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204272.36891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204272.36901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204272.37012: variable 'ansible_distribution' from source: facts 28285 1727204272.37019: variable 'ansible_distribution_major_version' from source: facts 28285 1727204272.37036: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204272.37039: when evaluation is False, skipping this task 28285 1727204272.37041: _execute() done 28285 1727204272.37044: dumping result to json 28285 1727204272.37046: done dumping result, returning 28285 1727204272.37056: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-57a1-d976-000000000173] 28285 1727204272.37062: sending task result for task 0affcd87-79f5-57a1-d976-000000000173 28285 1727204272.37149: done sending task result for task 0affcd87-79f5-57a1-d976-000000000173 28285 1727204272.37153: WORKER PROCESS EXITING skipping: [managed-node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 28285 1727204272.37199: no more pending results, returning what we have 28285 1727204272.37202: results queue empty 28285 1727204272.37203: checking for any_errors_fatal 28285 1727204272.37209: done checking for any_errors_fatal 28285 1727204272.37209: checking for max_fail_percentage 28285 1727204272.37211: done checking for max_fail_percentage 28285 1727204272.37212: checking to see if all hosts have failed and the running result is not ok 28285 1727204272.37213: done checking to see if all hosts have failed 28285 1727204272.37213: getting the remaining hosts for this loop 28285 1727204272.37216: done getting the remaining hosts for this loop 28285 1727204272.37220: getting the next task for host managed-node1 28285 1727204272.37227: done getting next task for host managed-node1 28285 1727204272.37231: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28285 1727204272.37235: ^ state is: HOST STATE: block=3, task=24, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 28285 1727204272.37256: getting variables 28285 1727204272.37258: in VariableManager get_vars() 28285 1727204272.37309: Calling all_inventory to load vars for managed-node1 28285 1727204272.37311: Calling groups_inventory to load vars for managed-node1 28285 1727204272.37314: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204272.37322: Calling all_plugins_play to load vars for managed-node1 28285 1727204272.37324: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204272.37327: Calling groups_plugins_play to load vars for managed-node1 28285 1727204272.37456: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204272.37593: done with get_vars() 28285 1727204272.37602: done getting variables 28285 1727204272.37665: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:57:52 -0400 (0:00:00.049) 0:00:13.187 ***** 28285 1727204272.37709: entering _queue_task() for managed-node1/debug 28285 1727204272.38004: worker is 1 (out of 1 available) 28285 1727204272.38017: exiting _queue_task() for managed-node1/debug 28285 1727204272.38029: done queuing things up, now waiting for results queue to drain 28285 1727204272.38031: waiting for pending results... 
28285 1727204272.38312: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28285 1727204272.38449: in run() - task 0affcd87-79f5-57a1-d976-000000000174 28285 1727204272.38474: variable 'ansible_search_path' from source: unknown 28285 1727204272.38483: variable 'ansible_search_path' from source: unknown 28285 1727204272.38524: calling self._execute() 28285 1727204272.38690: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204272.38717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204272.38750: variable 'omit' from source: magic vars 28285 1727204272.39166: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204272.40859: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204272.40906: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204272.40934: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204272.40963: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204272.40987: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204272.41042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204272.41066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204272.41086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204272.41116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204272.41126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204272.41234: variable 'ansible_distribution' from source: facts 28285 1727204272.41240: variable 'ansible_distribution_major_version' from source: facts 28285 1727204272.41258: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204272.41261: when evaluation is False, skipping this task 28285 1727204272.41265: _execute() done 28285 1727204272.41268: dumping result to json 28285 1727204272.41270: done dumping result, returning 28285 1727204272.41277: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-57a1-d976-000000000174] 28285 1727204272.41283: sending task result for task 0affcd87-79f5-57a1-d976-000000000174 28285 1727204272.41376: done sending task result for task 0affcd87-79f5-57a1-d976-000000000174 28285 1727204272.41378: WORKER PROCESS EXITING 
skipping: [managed-node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 28285 1727204272.41440: no more pending results, returning what we have 28285 1727204272.41443: results queue empty 28285 1727204272.41444: checking for any_errors_fatal 28285 1727204272.41449: done checking for any_errors_fatal 28285 1727204272.41450: checking for max_fail_percentage 28285 1727204272.41452: done checking for max_fail_percentage 28285 1727204272.41453: checking to see if all hosts have failed and the running result is not ok 28285 1727204272.41453: done checking to see if all hosts have failed 28285 1727204272.41454: getting the remaining hosts for this loop 28285 1727204272.41456: done getting the remaining hosts for this loop 28285 1727204272.41459: getting the next task for host managed-node1 28285 1727204272.41468: done getting next task for host managed-node1 28285 1727204272.41472: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 28285 1727204272.41477: ^ state is: HOST STATE: block=3, task=24, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 28285 1727204272.41497: getting variables 28285 1727204272.41499: in VariableManager get_vars() 28285 1727204272.41546: Calling all_inventory to load vars for managed-node1 28285 1727204272.41549: Calling groups_inventory to load vars for managed-node1 28285 1727204272.41551: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204272.41560: Calling all_plugins_play to load vars for managed-node1 28285 1727204272.41562: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204272.41566: Calling groups_plugins_play to load vars for managed-node1 28285 1727204272.41728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204272.41853: done with get_vars() 28285 1727204272.41861: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:57:52 -0400 (0:00:00.042) 0:00:13.229 ***** 28285 1727204272.41931: entering _queue_task() for managed-node1/ping 28285 1727204272.42141: worker is 1 (out of 1 available) 28285 1727204272.42154: exiting _queue_task() for managed-node1/ping 28285 1727204272.42168: done queuing things up, now waiting for results queue to drain 28285 1727204272.42170: waiting for pending results... 
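The last role task reached in this section is "Re-test connectivity", queued above through the ping action; the lines that follow show it skipping on the same condition as the rest. A minimal sketch, assuming the builtin ping module with no arguments:

# Illustrative sketch: a connectivity re-test guarded like the rest of this block.
- name: Re-test connectivity
  ansible.builtin.ping:
  when: ansible_distribution in ['CentOS', 'RedHat'] and ansible_distribution_major_version | int < 9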
28285 1727204272.42341: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 28285 1727204272.42443: in run() - task 0affcd87-79f5-57a1-d976-000000000175 28285 1727204272.42454: variable 'ansible_search_path' from source: unknown 28285 1727204272.42458: variable 'ansible_search_path' from source: unknown 28285 1727204272.42493: calling self._execute() 28285 1727204272.42558: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204272.42562: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204272.42574: variable 'omit' from source: magic vars 28285 1727204272.42884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204272.44547: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204272.44603: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204272.44633: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204272.44664: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204272.44686: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204272.44746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204272.44771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204272.44791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204272.44817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204272.44829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204272.44932: variable 'ansible_distribution' from source: facts 28285 1727204272.44937: variable 'ansible_distribution_major_version' from source: facts 28285 1727204272.44956: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204272.44959: when evaluation is False, skipping this task 28285 1727204272.44962: _execute() done 28285 1727204272.44966: dumping result to json 28285 1727204272.44968: done dumping result, returning 28285 1727204272.44976: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-57a1-d976-000000000175] 28285 1727204272.44984: sending task result for task 0affcd87-79f5-57a1-d976-000000000175 28285 1727204272.45070: done sending task result for task 0affcd87-79f5-57a1-d976-000000000175 28285 1727204272.45073: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": 
false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204272.45142: no more pending results, returning what we have 28285 1727204272.45146: results queue empty 28285 1727204272.45146: checking for any_errors_fatal 28285 1727204272.45154: done checking for any_errors_fatal 28285 1727204272.45155: checking for max_fail_percentage 28285 1727204272.45156: done checking for max_fail_percentage 28285 1727204272.45158: checking to see if all hosts have failed and the running result is not ok 28285 1727204272.45158: done checking to see if all hosts have failed 28285 1727204272.45159: getting the remaining hosts for this loop 28285 1727204272.45161: done getting the remaining hosts for this loop 28285 1727204272.45166: getting the next task for host managed-node1 28285 1727204272.45175: done getting next task for host managed-node1 28285 1727204272.45177: ^ task is: TASK: meta (role_complete) 28285 1727204272.45181: ^ state is: HOST STATE: block=3, task=24, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28285 1727204272.45205: getting variables 28285 1727204272.45207: in VariableManager get_vars() 28285 1727204272.45257: Calling all_inventory to load vars for managed-node1 28285 1727204272.45260: Calling groups_inventory to load vars for managed-node1 28285 1727204272.45262: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204272.45273: Calling all_plugins_play to load vars for managed-node1 28285 1727204272.45275: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204272.45278: Calling groups_plugins_play to load vars for managed-node1 28285 1727204272.45403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204272.45531: done with get_vars() 28285 1727204272.45540: done getting variables 28285 1727204272.45600: done queuing things up, now waiting for results queue to drain 28285 1727204272.45602: results queue empty 28285 1727204272.45602: checking for any_errors_fatal 28285 1727204272.45604: done checking for any_errors_fatal 28285 1727204272.45605: checking for max_fail_percentage 28285 1727204272.45605: done checking for max_fail_percentage 28285 1727204272.45606: checking to see if all hosts have failed and the running result is not ok 28285 1727204272.45606: done checking to see if all hosts have failed 28285 1727204272.45607: getting the remaining hosts for this loop 28285 1727204272.45607: done getting the remaining hosts for this loop 28285 1727204272.45609: getting the next task for host managed-node1 28285 1727204272.45612: done getting next task for host managed-node1 28285 1727204272.45614: ^ task is: TASK: Include the task 'manage_test_interface.yml' 28285 1727204272.45615: ^ state is: HOST STATE: block=3, task=24, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28285 1727204272.45617: getting variables 28285 1727204272.45617: in VariableManager get_vars() 28285 1727204272.45638: Calling all_inventory to load vars for managed-node1 28285 1727204272.45640: Calling groups_inventory to load vars for managed-node1 28285 1727204272.45641: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204272.45644: Calling all_plugins_play to load vars for managed-node1 28285 1727204272.45646: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204272.45647: Calling groups_plugins_play to load vars for managed-node1 28285 1727204272.45731: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204272.45871: done with get_vars() 28285 1727204272.45878: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethtool_features.yml:215 Tuesday 24 September 2024 14:57:52 -0400 (0:00:00.039) 0:00:13.269 ***** 28285 1727204272.45925: entering _queue_task() for managed-node1/include_tasks 28285 1727204272.46143: worker is 1 (out of 1 available) 28285 1727204272.46160: exiting _queue_task() for managed-node1/include_tasks 28285 1727204272.46173: done queuing things up, now waiting for results queue to drain 28285 1727204272.46174: waiting for pending results... 28285 1727204272.46338: running TaskExecutor() for managed-node1/TASK: Include the task 'manage_test_interface.yml' 28285 1727204272.46411: in run() - task 0affcd87-79f5-57a1-d976-0000000001a5 28285 1727204272.46422: variable 'ansible_search_path' from source: unknown 28285 1727204272.46454: calling self._execute() 28285 1727204272.46517: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204272.46521: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204272.46530: variable 'omit' from source: magic vars 28285 1727204272.46841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204272.48565: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204272.48618: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204272.48648: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204272.48678: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204272.48699: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204272.48760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204272.48782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204272.48801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204272.48829: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204272.48843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204272.48942: variable 'ansible_distribution' from source: facts 28285 1727204272.48951: variable 'ansible_distribution_major_version' from source: facts 28285 1727204272.48968: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204272.48973: when evaluation is False, skipping this task 28285 1727204272.48976: _execute() done 28285 1727204272.48979: dumping result to json 28285 1727204272.48981: done dumping result, returning 28285 1727204272.48987: done running TaskExecutor() for managed-node1/TASK: Include the task 'manage_test_interface.yml' [0affcd87-79f5-57a1-d976-0000000001a5] 28285 1727204272.48992: sending task result for task 0affcd87-79f5-57a1-d976-0000000001a5 28285 1727204272.49098: done sending task result for task 0affcd87-79f5-57a1-d976-0000000001a5 28285 1727204272.49101: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204272.49146: no more pending results, returning what we have 28285 1727204272.49150: results queue empty 28285 1727204272.49151: checking for any_errors_fatal 28285 1727204272.49152: done checking for any_errors_fatal 28285 1727204272.49153: checking for max_fail_percentage 28285 1727204272.49155: done checking for max_fail_percentage 28285 1727204272.49156: checking to see if all hosts have failed and the running result is not ok 28285 1727204272.49157: done checking to see if all hosts have failed 28285 1727204272.49157: getting the remaining hosts for this loop 28285 1727204272.49159: done getting the remaining hosts for this loop 28285 1727204272.49165: getting the next task for host managed-node1 28285 1727204272.49172: done getting next task for host managed-node1 28285 1727204272.49181: ^ task is: TASK: Verify network state restored to default 28285 1727204272.49184: ^ state is: HOST STATE: block=3, task=24, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 28285 1727204272.49188: getting variables 28285 1727204272.49190: in VariableManager get_vars() 28285 1727204272.49294: Calling all_inventory to load vars for managed-node1 28285 1727204272.49297: Calling groups_inventory to load vars for managed-node1 28285 1727204272.49300: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204272.49312: Calling all_plugins_play to load vars for managed-node1 28285 1727204272.49315: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204272.49318: Calling groups_plugins_play to load vars for managed-node1 28285 1727204272.49518: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204272.49746: done with get_vars() 28285 1727204272.49759: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethtool_features.yml:219 Tuesday 24 September 2024 14:57:52 -0400 (0:00:00.039) 0:00:13.308 ***** 28285 1727204272.49868: entering _queue_task() for managed-node1/include_tasks 28285 1727204272.50383: worker is 1 (out of 1 available) 28285 1727204272.50394: exiting _queue_task() for managed-node1/include_tasks 28285 1727204272.50406: done queuing things up, now waiting for results queue to drain 28285 1727204272.50408: waiting for pending results... 28285 1727204272.50680: running TaskExecutor() for managed-node1/TASK: Verify network state restored to default 28285 1727204272.50798: in run() - task 0affcd87-79f5-57a1-d976-0000000001a6 28285 1727204272.50819: variable 'ansible_search_path' from source: unknown 28285 1727204272.50870: calling self._execute() 28285 1727204272.50962: variable 'ansible_host' from source: host vars for 'managed-node1' 28285 1727204272.50975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 28285 1727204272.50988: variable 'omit' from source: magic vars 28285 1727204272.51387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28285 1727204272.53175: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28285 1727204272.53252: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28285 1727204272.53297: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28285 1727204272.53338: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28285 1727204272.53377: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28285 1727204272.53466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28285 1727204272.53499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28285 1727204272.53528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28285 1727204272.53574: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28285 1727204272.53592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28285 1727204272.53731: variable 'ansible_distribution' from source: facts 28285 1727204272.53742: variable 'ansible_distribution_major_version' from source: facts 28285 1727204272.53765: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 28285 1727204272.53774: when evaluation is False, skipping this task 28285 1727204272.53780: _execute() done 28285 1727204272.53787: dumping result to json 28285 1727204272.53793: done dumping result, returning 28285 1727204272.53803: done running TaskExecutor() for managed-node1/TASK: Verify network state restored to default [0affcd87-79f5-57a1-d976-0000000001a6] 28285 1727204272.53812: sending task result for task 0affcd87-79f5-57a1-d976-0000000001a6 28285 1727204272.53926: done sending task result for task 0affcd87-79f5-57a1-d976-0000000001a6 skipping: [managed-node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 28285 1727204272.53978: no more pending results, returning what we have 28285 1727204272.53982: results queue empty 28285 1727204272.53982: checking for any_errors_fatal 28285 1727204272.53989: done checking for any_errors_fatal 28285 1727204272.53990: checking for max_fail_percentage 28285 1727204272.53991: done checking for max_fail_percentage 28285 1727204272.53992: checking to see if all hosts have failed and the running result is not ok 28285 1727204272.53993: done checking to see if all hosts have failed 28285 1727204272.53994: getting the remaining hosts for this loop 28285 1727204272.53995: done getting the remaining hosts for this loop 28285 1727204272.54000: getting the next task for host managed-node1 28285 1727204272.54008: done getting next task for host managed-node1 28285 1727204272.54009: ^ task is: TASK: meta (flush_handlers) 28285 1727204272.54011: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28285 1727204272.54015: getting variables 28285 1727204272.54016: in VariableManager get_vars() 28285 1727204272.54072: Calling all_inventory to load vars for managed-node1 28285 1727204272.54075: Calling groups_inventory to load vars for managed-node1 28285 1727204272.54077: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204272.54089: Calling all_plugins_play to load vars for managed-node1 28285 1727204272.54091: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204272.54093: Calling groups_plugins_play to load vars for managed-node1 28285 1727204272.54273: WORKER PROCESS EXITING 28285 1727204272.54280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204272.54557: done with get_vars() 28285 1727204272.54572: done getting variables 28285 1727204272.54644: in VariableManager get_vars() 28285 1727204272.54672: Calling all_inventory to load vars for managed-node1 28285 1727204272.54674: Calling groups_inventory to load vars for managed-node1 28285 1727204272.54676: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204272.54681: Calling all_plugins_play to load vars for managed-node1 28285 1727204272.54683: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204272.54686: Calling groups_plugins_play to load vars for managed-node1 28285 1727204272.54842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204272.55066: done with get_vars() 28285 1727204272.55081: done queuing things up, now waiting for results queue to drain 28285 1727204272.55083: results queue empty 28285 1727204272.55084: checking for any_errors_fatal 28285 1727204272.55086: done checking for any_errors_fatal 28285 1727204272.55087: checking for max_fail_percentage 28285 1727204272.55088: done checking for max_fail_percentage 28285 1727204272.55089: checking to see if all hosts have failed and the running result is not ok 28285 1727204272.55089: done checking to see if all hosts have failed 28285 1727204272.55090: getting the remaining hosts for this loop 28285 1727204272.55091: done getting the remaining hosts for this loop 28285 1727204272.55094: getting the next task for host managed-node1 28285 1727204272.55098: done getting next task for host managed-node1 28285 1727204272.55099: ^ task is: TASK: meta (flush_handlers) 28285 1727204272.55101: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28285 1727204272.55104: getting variables 28285 1727204272.55105: in VariableManager get_vars() 28285 1727204272.55124: Calling all_inventory to load vars for managed-node1 28285 1727204272.55126: Calling groups_inventory to load vars for managed-node1 28285 1727204272.55128: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204272.55143: Calling all_plugins_play to load vars for managed-node1 28285 1727204272.55146: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204272.55151: Calling groups_plugins_play to load vars for managed-node1 28285 1727204272.55647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204272.55840: done with get_vars() 28285 1727204272.55849: done getting variables 28285 1727204272.55898: in VariableManager get_vars() 28285 1727204272.55917: Calling all_inventory to load vars for managed-node1 28285 1727204272.55920: Calling groups_inventory to load vars for managed-node1 28285 1727204272.55922: Calling all_plugins_inventory to load vars for managed-node1 28285 1727204272.55926: Calling all_plugins_play to load vars for managed-node1 28285 1727204272.55928: Calling groups_plugins_inventory to load vars for managed-node1 28285 1727204272.55931: Calling groups_plugins_play to load vars for managed-node1 28285 1727204272.56118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28285 1727204272.56315: done with get_vars() 28285 1727204272.56331: done queuing things up, now waiting for results queue to drain 28285 1727204272.56333: results queue empty 28285 1727204272.56334: checking for any_errors_fatal 28285 1727204272.56335: done checking for any_errors_fatal 28285 1727204272.56336: checking for max_fail_percentage 28285 1727204272.56337: done checking for max_fail_percentage 28285 1727204272.56337: checking to see if all hosts have failed and the running result is not ok 28285 1727204272.56338: done checking to see if all hosts have failed 28285 1727204272.56339: getting the remaining hosts for this loop 28285 1727204272.56340: done getting the remaining hosts for this loop 28285 1727204272.56342: getting the next task for host managed-node1 28285 1727204272.56345: done getting next task for host managed-node1 28285 1727204272.56346: ^ task is: None 28285 1727204272.56347: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28285 1727204272.56351: done queuing things up, now waiting for results queue to drain 28285 1727204272.56352: results queue empty 28285 1727204272.56353: checking for any_errors_fatal 28285 1727204272.56353: done checking for any_errors_fatal 28285 1727204272.56354: checking for max_fail_percentage 28285 1727204272.56355: done checking for max_fail_percentage 28285 1727204272.56356: checking to see if all hosts have failed and the running result is not ok 28285 1727204272.56356: done checking to see if all hosts have failed 28285 1727204272.56358: getting the next task for host managed-node1 28285 1727204272.56361: done getting next task for host managed-node1 28285 1727204272.56362: ^ task is: None 28285 1727204272.56363: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed-node1 : ok=7 changed=0 unreachable=0 failed=0 skipped=144 rescued=0 ignored=0

Tuesday 24 September 2024 14:57:52 -0400 (0:00:00.065) 0:00:13.374 *****
===============================================================================
Gathering Facts --------------------------------------------------------- 2.69s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethtool_features_initscripts.yml:5
Gather the minimum subset of ansible_facts required by the network role test --- 0.87s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
Check if system is ostree ----------------------------------------------- 0.65s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
Include the task 'enable_epel.yml' -------------------------------------- 0.15s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51
Install ethtool (test dependency) --------------------------------------- 0.14s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethtool_features.yml:26
Include the task 'assert_device_present.yml' ---------------------------- 0.13s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethtool_features.yml:24
fedora.linux_system_roles.network : Show stderr messages for the network_connections --- 0.11s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177
Include the task 'manage_test_interface.yml' ---------------------------- 0.11s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethtool_features.yml:20
Get current device features --------------------------------------------- 0.11s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethtool_features.yml:53
Configure ethtool features setting -------------------------------------- 0.11s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethtool_features.yml:140
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.11s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Check failure ----------------------------------------------------------- 0.11s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethtool_features.yml:170
fedora.linux_system_roles.network : Show stderr messages for the network_connections --- 0.11s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177
fedora.linux_system_roles.network : Configure networking state ---------- 0.11s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171
fedora.linux_system_roles.network : Show debug messages for the network_connections --- 0.11s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
fedora.linux_system_roles.network : Configure networking state ---------- 0.10s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.10s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
INIT: Ethtool feeatures tests ------------------------------------------- 0.10s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethtool_features.yml:15
fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces --- 0.10s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Set network provider to 'initscripts' ----------------------------------- 0.09s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethtool_features_initscripts.yml:12
28285 1727204272.56484: RUNNING CLEANUP
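Editor's note on the repeated skips above: every remaining task in this run was skipped for the same reason. Each one carries the shared guard "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", which evaluates to False against this node's gathered facts (the node is either not CentOS/RedHat or is major version 9 or later), so the TaskExecutor short-circuits before any module runs and the callback prints the skipping result with "skip_reason": "Conditional result was False". The following is a minimal sketch of a task gated the same way; it is not taken from the test suite, and the play name, hosts pattern, and ping command are placeholders for illustration only.

    - name: Sketch of a distro-gated task (placeholders, not the real test play)
      hosts: all
      gather_facts: true   # the guard below needs ansible_distribution* facts
      tasks:
        - name: Re-test connectivity (placeholder command, not the role's actual task)
          ansible.builtin.command: ping -c 1 -W 1 localhost
          changed_when: false
          when: >-
            ansible_distribution in ['CentOS','RedHat']
            and ansible_distribution_major_version | int < 9

On a host where the expression is False, ansible-playbook reports the task exactly as in the log above: skipping, changed=false, and the false_condition echoed back in the JSON result.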
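Editor's note on the "TASK: meta (flush_handlers)" entries: these are not tasks written in the test playbook; Ansible inserts them implicitly at the end of each section of a play so that any notified handlers run before the play finishes, which is why they appear here even though nothing was notified. The same behaviour can be requested explicitly with ansible.builtin.meta. The sketch below is illustrative only; the file path, notify name, and debug handler are placeholders, not anything from this run.

    - name: Sketch of explicit handler flushing (placeholder names)
      hosts: all
      gather_facts: false
      tasks:
        - name: Change something that notifies a handler
          ansible.builtin.copy:
            dest: /tmp/example.conf
            content: "setting=1\n"
          notify: Report example change

        - name: Run any notified handlers now, instead of at the end of the play
          ansible.builtin.meta: flush_handlers

      handlers:
        - name: Report example change
          ansible.builtin.debug:
            msg: handler ran (stand-in for a real service restart)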