[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
33932 1726882878.89991: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-Xyq
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
33932 1726882878.90507: Added group all to inventory
33932 1726882878.90510: Added group ungrouped to inventory
33932 1726882878.90514: Group all now contains ungrouped
33932 1726882878.90517: Examining possible inventory source: /tmp/network-91m/inventory.yml
33932 1726882879.16041: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
33932 1726882879.16106: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
33932 1726882879.16129: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
33932 1726882879.16192: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
33932 1726882879.16268: Loaded config def from plugin (inventory/script)
33932 1726882879.16270: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
33932 1726882879.16313: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
33932 1726882879.16404: Loaded config def from plugin (inventory/yaml)
33932 1726882879.16406: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
33932 1726882879.16494: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
33932 1726882879.17159: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
33932 1726882879.17163: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
33932 1726882879.17167: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
33932 1726882879.17173: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
33932 1726882879.17178: Loading data from /tmp/network-91m/inventory.yml
33932 1726882879.17246: /tmp/network-91m/inventory.yml was not parsable by auto
33932 1726882879.17314: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
33932 1726882879.17352: Loading data from /tmp/network-91m/inventory.yml
33932 1726882879.17437: group all already in inventory
33932 1726882879.17444: set inventory_file for managed_node1
33932 1726882879.17448: set inventory_dir for managed_node1
33932 1726882879.17449: Added host managed_node1 to inventory
33932 1726882879.17452: Added host managed_node1 to group all
33932 1726882879.17453: set ansible_host for managed_node1
33932 1726882879.17454: set ansible_ssh_extra_args for managed_node1
33932 1726882879.17457: set inventory_file for managed_node2
33932 1726882879.17460: set inventory_dir for managed_node2
33932 1726882879.17460: Added host managed_node2 to inventory
33932 1726882879.17462: Added host managed_node2 to group all
33932 1726882879.17463: set ansible_host for managed_node2
33932 1726882879.17465: set ansible_ssh_extra_args for managed_node2
33932 1726882879.17468: set inventory_file for managed_node3
33932 1726882879.17470: set inventory_dir for managed_node3
33932 1726882879.17471: Added host managed_node3 to inventory
33932 1726882879.17472: Added host managed_node3 to group all
33932 1726882879.17473: set ansible_host for managed_node3
33932 1726882879.17474: set ansible_ssh_extra_args for managed_node3
33932 1726882879.17476: Reconcile groups and hosts in inventory.
33932 1726882879.17481: Group ungrouped now contains managed_node1
33932 1726882879.17483: Group ungrouped now contains managed_node2
33932 1726882879.17485: Group ungrouped now contains managed_node3
33932 1726882879.17562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
33932 1726882879.18280: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
33932 1726882879.18343: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
33932 1726882879.18477: Loaded config def from plugin (vars/host_group_vars)
33932 1726882879.18480: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
33932 1726882879.18487: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
33932 1726882879.18495: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
33932 1726882879.18537: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
33932 1726882879.19333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33932 1726882879.19429: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
33932 1726882879.19462: Loaded config def from plugin (connection/local)
33932 1726882879.19467: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
33932 1726882879.20042: Loaded config def from plugin (connection/paramiko_ssh)
33932 1726882879.20046: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
33932 1726882879.20967: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
33932 1726882879.21005: Loaded config def from plugin (connection/psrp)
33932 1726882879.21008: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
33932 1726882879.24246: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
33932 1726882879.24648: Loaded config def from plugin (connection/ssh)
33932 1726882879.24652: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
33932 1726882879.26859: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
33932 1726882879.26899: Loaded config def from plugin (connection/winrm)
33932 1726882879.26903: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
33932 1726882879.26935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
33932 1726882879.26997: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
33932 1726882879.27066: Loaded config def from plugin (shell/cmd)
33932 1726882879.27068: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
33932 1726882879.27094: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
33932 1726882879.27160: Loaded config def from plugin (shell/powershell)
33932 1726882879.27162: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
33932 1726882879.27216: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
33932 1726882879.27382: Loaded config def from plugin (shell/sh)
33932 1726882879.27385: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
33932 1726882879.27419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
33932 1726882879.27913: Loaded config def from plugin (become/runas)
33932 1726882879.27916: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
33932 1726882879.28340: Loaded config def from plugin (become/su)
33932 1726882879.28342: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
33932 1726882879.28911: Loaded config def from plugin (become/sudo)
33932 1726882879.28943: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
33932 1726882879.29004: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_vlan_mtu_nm.yml
33932 1726882879.30047: in VariableManager get_vars()
33932 1726882879.30071: done with get_vars()
33932 1726882879.30339: trying /usr/local/lib/python3.12/site-packages/ansible/modules
33932 1726882879.35504: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
33932 1726882879.35625: in VariableManager get_vars()
33932 1726882879.35630: done with get_vars()
33932 1726882879.35633: variable 'playbook_dir' from source: magic vars
33932 1726882879.35634: variable 'ansible_playbook_python' from source: magic vars
33932 1726882879.35634: variable 'ansible_config_file' from source: magic vars
33932 1726882879.35635: variable 'groups' from source: magic vars
33932 1726882879.35636: variable 'omit' from source: magic vars
33932 1726882879.35637: variable 'ansible_version' from source: magic vars
33932 1726882879.35637: variable 'ansible_check_mode' from source: magic vars
33932 1726882879.35638: variable 'ansible_diff_mode' from source: magic vars
33932 1726882879.35639: variable 'ansible_forks' from source: magic vars
33932 1726882879.35639: variable 'ansible_inventory_sources' from source: magic vars
33932 1726882879.35640: variable 'ansible_skip_tags' from source: magic vars
33932 1726882879.35641: variable 'ansible_limit' from source: magic vars
33932 1726882879.35641: variable 'ansible_run_tags' from source: magic vars
33932 1726882879.35642: variable 'ansible_verbosity' from source: magic vars
33932 1726882879.35679: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml
33932 1726882879.36224: in VariableManager get_vars()
33932 1726882879.36267: done with get_vars()
33932 1726882879.36355: in VariableManager get_vars()
33932 1726882879.36372: done with get_vars()
33932 1726882879.36411: in VariableManager get_vars()
33932 1726882879.36459: done with get_vars()
33932 1726882879.36891: in VariableManager get_vars()
33932 1726882879.36904: done with get_vars()
33932 1726882879.36909: variable 'omit' from source: magic vars
33932 1726882879.36927: variable 'omit' from source: magic vars
33932 1726882879.37070: in VariableManager get_vars()
33932 1726882879.37082: done with get_vars()
33932 1726882879.37145: in VariableManager get_vars()
33932 1726882879.37158: done with get_vars()
33932 1726882879.37197: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
33932 1726882879.37805: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
33932 1726882879.37931: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
33932 1726882879.39380: in VariableManager get_vars()
33932 1726882879.39399: done with get_vars()
33932 1726882879.40266: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
33932 1726882879.40426: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
33932 1726882879.42895: in VariableManager get_vars()
33932 1726882879.42914: done with get_vars()
33932 1726882879.42953: in VariableManager get_vars()
33932 1726882879.42990: done with get_vars()
33932 1726882879.43217: in VariableManager get_vars()
33932 1726882879.43234: done with get_vars()
33932 1726882879.43238: variable 'omit' from source: magic vars
33932 1726882879.43249: variable 'omit' from source: magic vars
33932 1726882879.43286: in VariableManager get_vars()
33932 1726882879.43300: done with get_vars()
33932 1726882879.43320: in VariableManager get_vars()
33932 1726882879.43334: done with get_vars()
33932 1726882879.43366: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
33932 1726882879.43486: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
33932 1726882879.43577: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
33932 1726882879.44502: in VariableManager get_vars()
33932 1726882879.44524: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
33932 1726882879.47235: in VariableManager get_vars()
33932 1726882879.47257: done with get_vars()
33932 1726882879.47409: in VariableManager get_vars()
33932 1726882879.47429: done with get_vars()
33932 1726882879.47485: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
33932 1726882879.47500: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
33932 1726882879.47752: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
33932 1726882879.47995: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
33932 1726882879.47998: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-Xyq/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__)
33932 1726882879.48029: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
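The inventory-parsing entries earlier in the log show a YAML inventory at /tmp/network-91m/inventory.yml with three otherwise ungrouped hosts, each carrying `ansible_host` and `ansible_ssh_extra_args` host vars. A hypothetical reconstruction of that file is sketched below; the addresses and SSH arguments are placeholders, since the log never prints the actual values.

```shell
# Hypothetical reconstruction of the inventory implied by the log.
# Only the structure (three hosts under "all", each with ansible_host
# and ansible_ssh_extra_args) comes from the log; every value is a placeholder.
cat > inventory.yml <<'EOF'
all:
  hosts:
    managed_node1:
      ansible_host: 192.0.2.1                               # placeholder
      ansible_ssh_extra_args: -o StrictHostKeyChecking=no   # placeholder
    managed_node2:
      ansible_host: 192.0.2.2                               # placeholder
      ansible_ssh_extra_args: -o StrictHostKeyChecking=no   # placeholder
    managed_node3:
      ansible_host: 192.0.2.3                               # placeholder
      ansible_ssh_extra_args: -o StrictHostKeyChecking=no   # placeholder
EOF
```

Because the hosts appear under `all` with no explicit child group, the yaml inventory plugin places them in `ungrouped`, matching the "Group ungrouped now contains managed_node1..3" lines above.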
33932 1726882879.48053: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
33932 1726882879.50826: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
33932 1726882879.50888: Loaded config def from plugin (callback/default)
33932 1726882879.50891: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
33932 1726882879.52599: Loaded config def from plugin (callback/junit)
33932 1726882879.52602: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
33932 1726882879.52654: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
33932 1726882879.52744: Loaded config def from plugin (callback/minimal)
33932 1726882879.52746: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
33932 1726882879.52929: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
33932 1726882879.53193: Loaded config def from plugin (callback/tree)
33932 1726882879.53195: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
33932 1726882879.53315: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
33932 1726882879.53317: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-Xyq/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
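The callback redirects and "Skipping callback" lines above are consistent with a configuration that selects ansible.posix.debug as the stdout callback and enables ansible.posix.profile_tasks (which produces the per-task timestamp lines later in the log). A minimal sketch follows, assuming ansible.cfg is the chosen mechanism; the run itself reports "config file = None", so the real run presumably used environment variables instead. Note the singular `collections_path` option, which also avoids the ANSIBLE_COLLECTIONS_PATHS deprecation warning printed at the top of the log.

```shell
# Minimal ansible.cfg consistent with the callback behaviour in this log.
# This is a sketch: the actual run used no config file, so these settings
# must have come from environment variables or defaults.
cat > ansible.cfg <<'EOF'
[defaults]
collections_path = /tmp/collections-Xyq
stdout_callback = ansible.posix.debug
callbacks_enabled = ansible.posix.profile_tasks
EOF
```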
PLAYBOOK: tests_vlan_mtu_nm.yml ************************************************
2 plays in /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_vlan_mtu_nm.yml
33932 1726882879.53342: in VariableManager get_vars()
33932 1726882879.53356: done with get_vars()
33932 1726882879.53362: in VariableManager get_vars()
33932 1726882879.53373: done with get_vars()
33932 1726882879.53377: variable 'omit' from source: magic vars
33932 1726882879.53413: in VariableManager get_vars()
33932 1726882879.53427: done with get_vars()
33932 1726882879.53449: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_vlan_mtu.yml' with nm as provider] *********
33932 1726882879.54138: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
33932 1726882879.54213: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
33932 1726882879.54241: getting the remaining hosts for this loop
33932 1726882879.54243: done getting the remaining hosts for this loop
33932 1726882879.54246: getting the next task for host managed_node1
33932 1726882879.54249: done getting next task for host managed_node1
33932 1726882879.54251: ^ task is: TASK: Gathering Facts
33932 1726882879.54252: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33932 1726882879.54255: getting variables
33932 1726882879.54256: in VariableManager get_vars()
33932 1726882879.54266: Calling all_inventory to load vars for managed_node1
33932 1726882879.54269: Calling groups_inventory to load vars for managed_node1
33932 1726882879.54272: Calling all_plugins_inventory to load vars for managed_node1
33932 1726882879.54284: Calling all_plugins_play to load vars for managed_node1
33932 1726882879.54296: Calling groups_plugins_inventory to load vars for managed_node1
33932 1726882879.54299: Calling groups_plugins_play to load vars for managed_node1
33932 1726882879.54333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33932 1726882879.54389: done with get_vars()
33932 1726882879.54395: done getting variables
33932 1726882879.54456: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_vlan_mtu_nm.yml:6
Friday 20 September 2024 21:41:19 -0400 (0:00:00.012) 0:00:00.012 ******
33932 1726882879.54479: entering _queue_task() for managed_node1/gather_facts
33932 1726882879.54480: Creating lock for gather_facts
33932 1726882879.54842: worker is 1 (out of 1 available)
33932 1726882879.54853: exiting _queue_task() for managed_node1/gather_facts
33932 1726882879.54867: done queuing things up, now waiting for results queue to drain
33932 1726882879.54869: waiting for pending results...
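The exact command line is not recorded in the log, but a hypothetical invocation can be pieced together from the banner ("config file = None", collection location /tmp/collections-Xyq), the inventory source path, the playbook path in the PLAYBOOK banner above, and the `-vvvv`-style debug output. Everything here is inferred, not quoted from the log:

```shell
# Hypothetical invocation consistent with this log; flags and ordering are
# assumptions, since the log never prints the actual command line.
cmd='ANSIBLE_COLLECTIONS_PATH=/tmp/collections-Xyq ansible-playbook -vvvv \
  -i /tmp/network-91m/inventory.yml \
  /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_vlan_mtu_nm.yml'
printf '%s\n' "$cmd"
```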
33932 1726882879.55101: running TaskExecutor() for managed_node1/TASK: Gathering Facts
33932 1726882879.55192: in run() - task 0e448fcc-3ce9-615b-5c48-0000000000af
33932 1726882879.55211: variable 'ansible_search_path' from source: unknown
33932 1726882879.55252: calling self._execute()
33932 1726882879.55313: variable 'ansible_host' from source: host vars for 'managed_node1'
33932 1726882879.55327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33932 1726882879.55339: variable 'omit' from source: magic vars
33932 1726882879.55438: variable 'omit' from source: magic vars
33932 1726882879.55470: variable 'omit' from source: magic vars
33932 1726882879.55580: variable 'omit' from source: magic vars
33932 1726882879.55626: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
33932 1726882879.55797: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
33932 1726882879.55820: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
33932 1726882879.55842: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
33932 1726882879.55857: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
33932 1726882879.55896: variable 'inventory_hostname' from source: host vars for 'managed_node1'
33932 1726882879.55987: variable 'ansible_host' from source: host vars for 'managed_node1'
33932 1726882879.55996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33932 1726882879.56230: Set connection var ansible_shell_executable to /bin/sh
33932 1726882879.56242: Set connection var ansible_timeout to 10
33932 1726882879.56250: Set connection var ansible_module_compression to ZIP_DEFLATED
33932 1726882879.56259: Set connection var ansible_pipelining to False
33932 1726882879.56268: Set connection var ansible_connection to ssh
33932 1726882879.56274: Set connection var ansible_shell_type to sh
33932 1726882879.56310: variable 'ansible_shell_executable' from source: unknown
33932 1726882879.56319: variable 'ansible_connection' from source: unknown
33932 1726882879.56327: variable 'ansible_module_compression' from source: unknown
33932 1726882879.56419: variable 'ansible_shell_type' from source: unknown
33932 1726882879.56429: variable 'ansible_shell_executable' from source: unknown
33932 1726882879.56437: variable 'ansible_host' from source: host vars for 'managed_node1'
33932 1726882879.56445: variable 'ansible_pipelining' from source: unknown
33932 1726882879.56451: variable 'ansible_timeout' from source: unknown
33932 1726882879.56458: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33932 1726882879.56882: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
33932 1726882879.56898: variable 'omit' from source: magic vars
33932 1726882879.56907: starting attempt loop
33932 1726882879.56915: running the handler
33932 1726882879.56941: variable 'ansible_facts' from source: unknown
33932 1726882879.56974: _low_level_execute_command(): starting
33932 1726882879.56987: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
33932 1726882879.57703: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
33932 1726882879.57723: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
33932 1726882879.57739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
33932 1726882879.57756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
33932 1726882879.57799: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
33932 1726882879.57812: stderr chunk (state=3): >>>debug2: match not found <<<
33932 1726882879.57831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
33932 1726882879.57850: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
33932 1726882879.57862: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<<
33932 1726882879.57879: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
33932 1726882879.57892: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
33932 1726882879.57905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
33932 1726882879.57920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
33932 1726882879.57935: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
33932 1726882879.57946: stderr chunk (state=3): >>>debug2: match found <<<
33932 1726882879.57959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
33932 1726882879.58036: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
33932 1726882879.58066: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
33932 1726882879.58084: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
33932 1726882879.58210: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
33932 1726882879.59891: stdout chunk (state=3): >>>/root <<<
33932 1726882879.60021: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
33932 1726882879.60079: stderr chunk (state=3): >>><<<
33932 1726882879.60082: stdout chunk (state=3): >>><<<
33932 1726882879.60183: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.44.90 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
33932 1726882879.60186: _low_level_execute_command(): starting
33932 1726882879.60189: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882879.6009996-33970-176995712533830 `" && echo ansible-tmp-1726882879.6009996-33970-176995712533830="` echo /root/.ansible/tmp/ansible-tmp-1726882879.6009996-33970-176995712533830 `" ) && sleep 0'
33932 1726882879.60892: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
33932 1726882879.60902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
33932 1726882879.60943: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<<
33932 1726882879.60946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
33932 1726882879.60949: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration <<<
33932 1726882879.60951: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
33932 1726882879.60953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<<
33932 1726882879.60955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
33932 1726882879.61020: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
33932 1726882879.61023: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
33932 1726882879.61026: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
33932 1726882879.61132: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
33932 1726882879.62995: stdout chunk (state=3): >>>ansible-tmp-1726882879.6009996-33970-176995712533830=/root/.ansible/tmp/ansible-tmp-1726882879.6009996-33970-176995712533830 <<<
33932 1726882879.63106: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
33932 1726882879.63171: stderr chunk (state=3): >>><<<
33932 1726882879.63174: stdout chunk (state=3): >>><<<
33932 1726882879.63372: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882879.6009996-33970-176995712533830=/root/.ansible/tmp/ansible-tmp-1726882879.6009996-33970-176995712533830
, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.44.90 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
33932 1726882879.63376: variable 'ansible_module_compression' from source: unknown
33932 1726882879.63378: ANSIBALLZ: Using generic lock for ansible.legacy.setup
33932 1726882879.63380: ANSIBALLZ: Acquiring lock
33932 1726882879.63383: ANSIBALLZ: Lock acquired: 140301144901104
33932 1726882879.63385: ANSIBALLZ: Creating module
33932 1726882880.02867: ANSIBALLZ: Writing module into payload
33932 1726882880.03057: ANSIBALLZ: Writing module
33932 1726882880.03084: ANSIBALLZ: Renaming module
33932 1726882880.03095: ANSIBALLZ: Done creating module
33932 1726882880.03128: variable 'ansible_facts' from source: unknown
33932 1726882880.03135: variable 'inventory_hostname' from source: host vars for 'managed_node1'
33932 1726882880.03144: _low_level_execute_command(): starting
33932 1726882880.03155: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0'
33932 1726882880.04735: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
33932 1726882880.04744: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
33932 1726882880.04758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
33932 1726882880.04786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
33932 1726882880.04828: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
33932 1726882880.04838: stderr chunk (state=3): >>>debug2: match not found <<<
33932 1726882880.04848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
33932 1726882880.04861: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
33932 1726882880.04873: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<<
33932 1726882880.04877: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
33932 1726882880.04886: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
33932 1726882880.04895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
33932 1726882880.04907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
33932 1726882880.04918: stderr chunk (state=3): >>>debug2: checking match for 'final all' host
10.31.44.90 originally 10.31.44.90 <<< 33932 1726882880.04925: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882880.04934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882880.05010: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882880.05031: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882880.05043: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882880.05182: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882880.07033: stdout chunk (state=3): >>>PLATFORM Linux FOUND /usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 33932 1726882880.07378: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882880.07381: stdout chunk (state=3): >>><<< 33932 1726882880.07390: stderr chunk (state=3): >>><<< 33932 1726882880.07408: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882880.07418 [managed_node1]: found interpreters: ['/usr/bin/python3.9', '/usr/bin/python3', '/usr/bin/python3'] 33932 1726882880.07461: _low_level_execute_command(): starting 33932 1726882880.07466: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 && sleep 0' 33932 1726882880.07871: Sending initial data 33932 1726882880.07874: Sent initial data (1181 bytes) 33932 1726882880.08538: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882880.08553: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882880.08573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882880.08596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882880.08639: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882880.08651: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882880.08670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882880.08691: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882880.08708: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882880.08720: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882880.08734: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882880.08747: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 33932 1726882880.08762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882880.08780: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882880.08792: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882880.08805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882880.08889: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882880.08910: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882880.08930: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882880.09058: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882880.12817: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 33932 1726882880.13698: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882880.13702: stdout chunk (state=3): >>><<< 33932 1726882880.13704: stderr chunk (state=3): >>><<< 33932 1726882880.13707: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 
9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882880.13709: variable 'ansible_facts' from source: unknown 33932 1726882880.13711: variable 'ansible_facts' from source: unknown 33932 1726882880.13713: variable 'ansible_module_compression' from source: unknown 33932 1726882880.13715: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-33932njcnmxb6/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 33932 1726882880.13716: variable 'ansible_facts' from source: unknown 33932 1726882880.13719: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726882879.6009996-33970-176995712533830/AnsiballZ_setup.py 33932 1726882880.13788: Sending initial data 33932 1726882880.13791: Sent initial data (154 bytes) 33932 1726882880.14745: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882880.14759: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882880.14779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882880.14798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882880.14855: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882880.14874: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882880.14889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882880.14915: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882880.14930: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882880.14944: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882880.14958: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882880.14983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882880.15000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882880.15026: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882880.15039: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882880.15053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 
1726882880.15149: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882880.15170: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882880.15197: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882880.15837: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882880.17538: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 33932 1726882880.17652: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 33932 1726882880.17797: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-33932njcnmxb6/tmpvfwrukc8 /root/.ansible/tmp/ansible-tmp-1726882879.6009996-33970-176995712533830/AnsiballZ_setup.py <<< 33932 1726882880.17938: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 33932 1726882880.20851: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882880.21072: stderr chunk (state=3): >>><<< 33932 1726882880.21075: stdout chunk (state=3): >>><<< 33932 1726882880.21078: done transferring module to remote 33932 1726882880.21080: _low_level_execute_command(): starting 33932 1726882880.21083: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726882879.6009996-33970-176995712533830/ /root/.ansible/tmp/ansible-tmp-1726882879.6009996-33970-176995712533830/AnsiballZ_setup.py && sleep 0' 33932 1726882880.22388: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882880.22521: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882880.22536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882880.22552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882880.22766: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882880.22807: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882880.22850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882880.22938: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882880.22996: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882880.23017: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882880.23047: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882880.23061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882880.23082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882880.23094: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882880.23104: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882880.23118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882880.23240: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882880.23357: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882880.23400: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882880.23551: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882880.25356: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882880.25454: stderr chunk (state=3): >>><<< 33932 1726882880.25462: stdout chunk (state=3): >>><<< 33932 1726882880.25492: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882880.25503: _low_level_execute_command(): starting 33932 1726882880.25512: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1726882879.6009996-33970-176995712533830/AnsiballZ_setup.py && sleep 0' 33932 1726882880.26233: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882880.26243: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882880.26258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882880.26299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882880.26357: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882880.26373: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882880.26422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882880.26484: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882880.26493: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882880.26499: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882880.26509: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882880.26542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882880.26554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882880.26561: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882880.26572: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882880.26581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882880.26780: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 
1726882880.26797: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882880.26808: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882880.26944: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882880.28926: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 33932 1726882880.28952: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 33932 1726882880.29005: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 33932 1726882880.29050: stdout chunk (state=3): >>>import 'posix' # <<< 33932 1726882880.29088: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 33932 1726882880.29134: stdout chunk (state=3): >>>import 'time' # <<< 33932 1726882880.29141: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 33932 1726882880.29192: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 33932 1726882880.29229: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 33932 1726882880.29243: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # <<< 33932 1726882880.29274: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa8d8dc0> <<< 33932 1726882880.29324: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 33932 1726882880.29338: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa87d3a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa8d8b20> <<< 33932 1726882880.29382: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa8d8ac0> <<< 33932 1726882880.29423: stdout chunk (state=3): >>>import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 33932 1726882880.29443: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa87d490> <<< 33932 1726882880.29475: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 33932 1726882880.29501: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa87d940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa87d670> <<< 33932 1726882880.29538: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 33932 1726882880.29567: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc 
matches /usr/lib64/python3.9/os.py <<< 33932 1726882880.29613: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 33932 1726882880.29672: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa834190> <<< 33932 1726882880.29704: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 33932 1726882880.29809: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa834220> <<< 33932 1726882880.29813: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 33932 1726882880.29815: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py <<< 33932 1726882880.29833: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa857850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa834940> <<< 33932 1726882880.29872: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa895880> <<< 33932 1726882880.29889: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from 
'/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' <<< 33932 1726882880.29892: stdout chunk (state=3): >>>import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa82dd90> <<< 33932 1726882880.29953: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # <<< 33932 1726882880.29956: stdout chunk (state=3): >>>import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa857d90> <<< 33932 1726882880.29994: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa87d970> <<< 33932 1726882880.30025: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 33932 1726882880.30500: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa5d3eb0> <<< 33932 1726882880.30556: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa5d6f40> <<< 33932 1726882880.30560: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 33932 1726882880.30743: stdout chunk (state=3): >>>import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 33932 1726882880.30757: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa5cc610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa5d2640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa5d3370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 33932 1726882880.30801: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 33932 1726882880.30840: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 33932 1726882880.30845: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 33932 1726882880.30877: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa4b9dc0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa4b98b0> <<< 33932 1726882880.30916: stdout chunk (state=3): >>>import 'itertools' # <<< 33932 1726882880.30939: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa4b9eb0> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 33932 1726882880.32250: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa4b9f70> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa4b9e80> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa5aed30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa5a7610> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa5bb670> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa5dae20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from 
'/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa4cbc70> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa5ae250> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa5bb280> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa5e09d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa4cbfa0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa4cbd90> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa4cbd00> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from 
'/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa49e370> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa49e460> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa4d3fa0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa4cda30> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa4cd490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa3d21c0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f85aa489c70> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa4cdeb0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa5e0040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa3e4af0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa3e4e20> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa3f6730> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa3f6c70> <<< 33932 1726882880.32376: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa38e3a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa3e4f10> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py <<< 33932 1726882880.32435: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa39f280> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa3f65b0> import 'pwd' # <<< 33932 1726882880.32503: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa39f340> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa4cb9d0> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 33932 1726882880.32557: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 33932 1726882880.32584: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object 
at 0x7f85aa3ba6a0> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 33932 1726882880.32620: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa3ba970> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa3ba760> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa3ba850> <<< 33932 1726882880.32731: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 33932 1726882880.32878: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa3baca0> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa3c71f0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f85aa3ba8e0> <<< 33932 1726882880.32901: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa3aea30> <<< 33932 1726882880.32920: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa4cb5b0> <<< 33932 1726882880.32949: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 33932 1726882880.32999: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 33932 1726882880.33038: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa3baa90> <<< 33932 1726882880.33192: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 33932 1726882880.33204: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f85aa2e4670> <<< 33932 1726882880.33476: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 33932 1726882880.33563: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.33618: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py # zipimport: zlib available <<< 33932 1726882880.33644: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py <<< 33932 1726882880.33647: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.34854: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.35782: stdout chunk 
(state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py <<< 33932 1726882880.35788: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa1787f0> <<< 33932 1726882880.35807: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 33932 1726882880.35824: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 33932 1726882880.35850: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 33932 1726882880.36446: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa209760> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa209640> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa209370> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa209490> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f85aa209190> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa209400> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa2097c0> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa1e27c0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa1e2b50> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa1e29a0> <<< 33932 1726882880.36627: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a9bc74f0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa202d30> <<< 33932 1726882880.37431: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa209520> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa202190> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa233a90> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa1d6190> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa1d6790> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a9bcdd00> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa1d66a0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa257d30> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa1599a0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa262e50> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa1690d0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa262e20> <<< 33932 1726882880.37470: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 33932 1726882880.37572: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 33932 1726882880.37602: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 33932 1726882880.37706: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa269220> <<< 33932 1726882880.37926: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa169100> <<< 33932 1726882880.38085: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa22db80> <<< 33932 1726882880.38152: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' <<< 33932 1726882880.38155: stdout chunk (state=3): >>>import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa262ac0> <<< 33932 1726882880.38265: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from 
'/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa262d00> <<< 33932 1726882880.38273: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa2e4820> <<< 33932 1726882880.38326: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py <<< 33932 1726882880.38352: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 33932 1726882880.38381: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 33932 1726882880.38394: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 33932 1726882880.38509: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' <<< 33932 1726882880.38519: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa1650d0> <<< 33932 1726882880.38715: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa15b370> <<< 33932 1726882880.38719: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa165d00> <<< 33932 1726882880.38765: stdout chunk (state=3): >>># extension module 
'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa1656a0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa166130> # zipimport: zlib available <<< 33932 1726882880.38785: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available <<< 33932 1726882880.38851: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.38942: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py <<< 33932 1726882880.38984: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py <<< 33932 1726882880.38987: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.39089: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.39176: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.39655: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.40328: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # <<< 33932 1726882880.40343: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 33932 1726882880.40361: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 33932 1726882880.40383: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 33932 1726882880.40451: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa1a18b0> <<< 33932 1726882880.40544: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 33932 1726882880.40570: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa1a6910> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a97c96a0> <<< 33932 1726882880.40625: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py <<< 33932 1726882880.40634: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 33932 1726882880.40649: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.40686: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py <<< 33932 1726882880.40688: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.40875: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.41674: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa1e07f0> # zipimport: zlib available <<< 33932 1726882880.41744: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.42345: stdout chunk (state=3): >>># zipimport: zlib available<<< 33932 1726882880.42350: stdout chunk (state=3): >>> <<< 33932 1726882880.42438: stdout chunk (state=3): >>># zipimport: zlib available<<< 33932 1726882880.42443: stdout chunk (state=3): >>> <<< 33932 1726882880.42544: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available <<< 33932 1726882880.42593: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.42643: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py <<< 33932 1726882880.42647: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.42722: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.42830: stdout 
chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py <<< 33932 1726882880.42833: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.42835: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.42876: stdout chunk (state=3): >>>import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available <<< 33932 1726882880.42889: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.42925: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available <<< 33932 1726882880.43120: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.43308: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 33932 1726882880.43340: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # <<< 33932 1726882880.43419: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a97ced90> # zipimport: zlib available <<< 33932 1726882880.43481: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.43565: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py <<< 33932 1726882880.43580: stdout chunk (state=3): >>>import ansible.module_utils.common.validation # loaded 
from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 33932 1726882880.43596: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.43614: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.43650: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py <<< 33932 1726882880.43661: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.43686: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.43733: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.43824: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.43885: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 33932 1726882880.43907: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 33932 1726882880.43980: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' <<< 33932 1726882880.43994: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from 
'/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa1950a0> <<< 33932 1726882880.44108: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a9795070> <<< 33932 1726882880.44128: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available <<< 33932 1726882880.44205: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.44237: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.44249: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.44294: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 33932 1726882880.44317: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 33932 1726882880.44345: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 33932 1726882880.44371: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 33932 1726882880.44389: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 33932 1726882880.44473: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f85aa19d160> <<< 33932 1726882880.44513: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa19acd0> <<< 33932 1726882880.44575: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a97cebb0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available <<< 33932 1726882880.44596: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.44625: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 33932 1726882880.44699: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py <<< 33932 1726882880.44736: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py <<< 33932 1726882880.44751: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.44793: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.44874: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 33932 1726882880.44889: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.44916: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 33932 1726882880.44959: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.44984: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.45022: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available <<< 33932 1726882880.45089: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.45152: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.45182: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.45211: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available <<< 33932 1726882880.45360: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.45495: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.45526: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.45584: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 33932 1726882880.45617: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' <<< 33932 1726882880.45635: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py 
# code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 33932 1726882880.45676: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a9549a60> <<< 33932 1726882880.45695: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 33932 1726882880.45707: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 33932 1726882880.45763: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 33932 1726882880.45784: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a97a86d0> <<< 33932 1726882880.45818: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85a97a8af0> <<< 33932 1726882880.45900: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a978f250> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a978fa30> <<< 33932 1726882880.45942: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a97de460> <<< 33932 1726882880.45945: stdout 
chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a97de910> <<< 33932 1726882880.45983: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 33932 1726882880.46001: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' <<< 33932 1726882880.46005: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 33932 1726882880.46047: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85a97dbd00> <<< 33932 1726882880.46072: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a97dbd60> <<< 33932 1726882880.46095: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py <<< 33932 1726882880.46105: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a97db250> <<< 33932 1726882880.46125: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 33932 1726882880.46139: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 33932 1726882880.46170: stdout chunk 
(state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85a95b1f70> <<< 33932 1726882880.46213: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a97f34c0> <<< 33932 1726882880.46252: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a97de310> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py <<< 33932 1726882880.46285: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py <<< 33932 1726882880.46302: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.46337: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.46385: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available <<< 33932 1726882880.46431: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.46488: stdout chunk (state=3): >>>import 
ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available <<< 33932 1726882880.46502: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py <<< 33932 1726882880.46531: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.46548: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.46567: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available <<< 33932 1726882880.46612: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.46665: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py <<< 33932 1726882880.46668: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.46702: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.46740: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available <<< 33932 1726882880.46794: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.46849: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.46892: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 
1726882880.46944: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py <<< 33932 1726882880.46962: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available <<< 33932 1726882880.47344: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.47708: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available <<< 33932 1726882880.47755: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.47804: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.47836: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.47888: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available <<< 33932 1726882880.47902: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.47936: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py <<< 33932 1726882880.47939: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.47987: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.48035: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available <<< 33932 1726882880.48084: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.48118: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available <<< 33932 1726882880.48158: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py <<< 33932 1726882880.48161: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.48239: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.48309: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 33932 1726882880.48362: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a94c9ca0> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 33932 1726882880.48367: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 33932 1726882880.48534: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a94c9fd0> import 
ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py <<< 33932 1726882880.49403: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.49422: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85a94b5370> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a950cbb0> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available <<< 33932 1726882880.49468: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 33932 1726882880.49527: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available <<< 33932 1726882880.49608: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.49657: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.49757: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.49888: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available <<< 33932 1726882880.49923: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.50011: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available <<< 33932 1726882880.50047: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 33932 1726882880.50130: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' <<< 33932 1726882880.50150: stdout chunk (state=3): >>># extension module 'termios' executed from 
'/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85a9444160> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a94442b0> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available <<< 33932 1726882880.50153: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available <<< 33932 1726882880.50183: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.50236: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available <<< 33932 1726882880.50359: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.50484: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available <<< 33932 1726882880.50581: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.50654: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.50692: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.50734: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py <<< 33932 1726882880.50738: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available <<< 33932 1726882880.50821: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.50835: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.50947: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.51074: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available <<< 33932 1726882880.51179: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.51284: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available <<< 33932 1726882880.51314: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.51345: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.51784: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.52210: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py <<< 33932 1726882880.52215: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.52288: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.52379: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available <<< 33932 1726882880.52462: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.52558: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py <<< 33932 1726882880.52561: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.52678: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.52827: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available <<< 33932 1726882880.52831: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py <<< 33932 1726882880.52850: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.52875: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 33932 1726882880.52920: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py <<< 33932 1726882880.52923: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.53005: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.53090: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.53250: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.53431: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 33932 1726882880.53434: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.53454: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.53485: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py <<< 33932 1726882880.53532: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.53537: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.53550: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available <<< 33932 
1726882880.53604: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.53678: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py <<< 33932 1726882880.53681: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.53704: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.53718: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available <<< 33932 1726882880.53767: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.53825: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py <<< 33932 1726882880.53829: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.53873: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.53958: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py <<< 33932 1726882880.53969: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.54197: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.54368: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py <<< 33932 1726882880.54372: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.54415: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.54470: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py <<< 33932 1726882880.54473: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.54495: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.54534: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available <<< 33932 1726882880.54559: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.54603: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py <<< 33932 1726882880.54606: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.54643: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.54671: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py <<< 33932 1726882880.54674: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.54733: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.54817: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py <<< 33932 1726882880.54840: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available <<< 33932 1726882880.54877: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.54939: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py <<< 33932 1726882880.54942: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.54971: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 33932 1726882880.55006: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.55050: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.55108: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.55195: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py <<< 33932 1726882880.55198: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 33932 1726882880.55226: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.55280: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available <<< 33932 1726882880.55447: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.55603: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py <<< 33932 1726882880.55608: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.55641: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.55684: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py <<< 33932 1726882880.55698: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.55735: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.55783: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py <<< 33932 1726882880.55787: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.55851: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.55921: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py <<< 
33932 1726882880.55930: stdout chunk (state=3): >>>import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available <<< 33932 1726882880.56000: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.56089: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py <<< 33932 1726882880.56092: stdout chunk (state=3): >>>import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 33932 1726882880.56153: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882880.56340: stdout chunk (state=3): >>>import 'gc' # <<< 33932 1726882880.57217: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py <<< 33932 1726882880.57224: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 33932 1726882880.57269: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py <<< 33932 1726882880.57284: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 33932 1726882880.57311: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # 
extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85a9495eb0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a94951f0> <<< 33932 1726882880.57366: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a92adc70> <<< 33932 1726882880.64688: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py <<< 33932 1726882880.64709: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' <<< 33932 1726882880.64740: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a9495100> <<< 33932 1726882880.64783: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py <<< 33932 1726882880.64786: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' <<< 33932 1726882880.64812: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a94445e0> <<< 33932 1726882880.64894: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py <<< 33932 1726882880.64904: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' <<< 33932 1726882880.64954: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches 
/usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a92a42e0> <<< 33932 1726882880.64957: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a943f8b0> <<< 33932 1726882880.65452: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 33932 1726882880.86892: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", 
"ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_iscsi_iqn": "", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJ<<< 33932 1726882880.86929: stdout chunk (state=3): >>>PDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", 
"ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", 
"tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type"<<< 33932 1726882880.87176: stdout chunk (state=3): >>>: "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", 
"tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], 
"ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2804, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 728, "free": 2804}, "nocache": {"free": 3269, "used": 263}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM<<< 33932 1726882880.87209: stdout chunk (state=3): >>> domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", 
"holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1038, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264234647552, "block_size": 4096, "block_total": 65519355, "block_available": 64510412, "block_used": 1008943, "inode_total": 131071472, "inode_available": 130998693, "inode_used": 72779, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "41", "second": "20", "epoch": "1726882880", "epoch_int": "1726882880", "date": "2024-09-20", "time": "21:41:20", "iso8601_micro": "2024-09-21T01:41:20.862153Z", "iso8601": "2024-09-21T01:41:20Z", "iso8601_basic": "20240920T214120862153", "iso8601_basic_short": "20240920T214120", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": 
"10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_is_chroot": false, "ansible_loadavg": {"1m": 0.51, "5m": 0.52, "15m": 0.32}, "ansible_lsb": {}, "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3,<<< 33932 1726882880.87218: stdout chunk (state=3): >>> "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 33932 1726882880.87779: stdout chunk (state=3): >>># clear builtins._ <<< 33932 1726882880.87785: stdout chunk (state=3): >>># clear sys.path # 
clear sys.argv<<< 33932 1726882880.87788: stdout chunk (state=3): >>> # clear sys.ps1 <<< 33932 1726882880.87801: stdout chunk (state=3): >>># clear sys.ps2 # clear sys.last_type # clear sys.last_value<<< 33932 1726882880.87822: stdout chunk (state=3): >>> # clear sys.last_traceback # clear sys.path_hooks <<< 33932 1726882880.87950: stdout chunk (state=3): >>># clear sys.path_importer_cache<<< 33932 1726882880.88061: stdout chunk (state=3): >>> <<< 33932 1726882880.88110: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.__interactivehook__ <<< 33932 1726882880.88124: stdout chunk (state=3): >>># restore sys.stdin<<< 33932 1726882880.88130: stdout chunk (state=3): >>> # restore sys.stdout <<< 33932 1726882880.88133: stdout chunk (state=3): >>># restore sys.stderr <<< 33932 1726882880.88240: stdout chunk (state=3): >>># cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib<<< 33932 1726882880.88258: stdout chunk (state=3): >>> # cleanup[2] removing _imp<<< 33932 1726882880.88301: stdout chunk (state=3): >>> # cleanup[2] removing _thread <<< 33932 1726882880.88905: stdout chunk (state=3): >>># cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io <<< 33932 1726882880.88910: stdout chunk (state=3): >>># cleanup[2] removing marshal<<< 33932 1726882880.88913: stdout chunk (state=3): >>> # cleanup[2] removing posix <<< 33932 1726882880.88915: stdout chunk (state=3): >>># cleanup[2] removing _frozen_importlib_external<<< 33932 1726882880.88923: stdout chunk (state=3): >>> # cleanup[2] removing time<<< 33932 1726882880.88928: stdout chunk (state=3): >>> # cleanup[2] removing zipimport <<< 33932 1726882880.88930: stdout chunk (state=3): >>># cleanup[2] removing _codecs <<< 33932 1726882880.88933: stdout chunk (state=3): >>># cleanup[2] removing codecs <<< 33932 1726882880.88935: stdout chunk (state=3): >>># cleanup[2] removing encodings.aliases<<< 33932 1726882880.88940: stdout chunk 
(state=3): >>> # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8<<< 33932 1726882880.88942: stdout chunk (state=3): >>> <<< 33932 1726882880.88946: stdout chunk (state=3): >>># cleanup[2] removing _signal <<< 33932 1726882880.88948: stdout chunk (state=3): >>># cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc<<< 33932 1726882880.88952: stdout chunk (state=3): >>> <<< 33932 1726882880.88955: stdout chunk (state=3): >>># cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing 
# cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil<<< 33932 1726882880.88981: stdout chunk (state=3): >>> # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors<<< 33932 1726882880.89016: stdout chunk (state=3): >>> # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime<<< 
33932 1726882880.89049: stdout chunk (state=3): >>> # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader<<< 33932 1726882880.89056: stdout chunk (state=3): >>> # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text<<< 33932 1726882880.89058: stdout chunk (state=3): >>> # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six<<< 33932 1726882880.89086: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux<<< 33932 1726882880.89113: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] 
removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast <<< 33932 1726882880.89151: stdout chunk (state=3): >>># cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale<<< 33932 1726882880.89154: stdout chunk (state=3): >>> # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils<<< 33932 1726882880.89174: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing 
multiprocessing.process # cleanup[2] removing _compat_pickle<<< 33932 1726882880.89204: stdout chunk (state=3): >>> # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing<<< 33932 1726882880.89218: stdout chunk (state=3): >>> # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot<<< 33932 1726882880.89249: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing 
ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform<<< 33932 1726882880.89253: stdout chunk (state=3): >>> # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios<<< 33932 1726882880.89272: stdout chunk (state=3): >>> # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd<<< 33932 1726882880.89284: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd<<< 33932 1726882880.89306: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing 
ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux<<< 33932 1726882880.89340: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system 
<<< 33932 1726882880.89366: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg <<< 33932 1726882880.89381: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos<<< 33932 1726882880.89423: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy 
ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux<<< 33932 1726882880.89426: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux<<< 33932 1726882880.89448: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize <<< 33932 1726882880.89661: stdout chunk (state=3): >>># cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 33932 1726882880.90503: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 33932 1726882880.90507: stdout chunk (state=3): >>># destroy zipimport # 
destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid <<< 33932 1726882880.90509: stdout chunk (state=3): >>># destroy selinux<<< 33932 1726882880.90543: stdout chunk (state=3): >>> # destroy distro # destroy logging <<< 33932 1726882880.90579: stdout chunk (state=3): >>># destroy argparse <<< 33932 1726882880.90601: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors<<< 33932 1726882880.90606: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.ansible_collector <<< 33932 1726882880.90609: stdout chunk (state=3): >>># destroy multiprocessing<<< 33932 1726882880.90611: stdout chunk (state=3): >>> # destroy multiprocessing.queues<<< 33932 1726882880.90613: stdout chunk (state=3): >>> # destroy multiprocessing.synchronize # destroy multiprocessing.dummy <<< 33932 1726882880.90616: stdout chunk (state=3): >>># destroy multiprocessing.pool <<< 33932 1726882880.90672: stdout chunk (state=3): >>># destroy pickle <<< 33932 1726882880.90675: stdout chunk (state=3): >>># destroy _compat_pickle <<< 33932 1726882880.90679: stdout chunk (state=3): >>># destroy queue<<< 33932 1726882880.90681: stdout chunk (state=3): >>> <<< 33932 1726882880.90683: stdout chunk (state=3): >>># destroy multiprocessing.reduction <<< 33932 1726882880.90685: stdout chunk (state=3): >>># destroy shlex<<< 33932 1726882880.90687: stdout chunk (state=3): >>> # destroy datetime<<< 33932 1726882880.90689: stdout chunk (state=3): >>> <<< 33932 1726882880.90782: stdout chunk (state=3): >>># destroy base64<<< 33932 1726882880.90811: stdout chunk (state=3): >>> <<< 33932 1726882880.90844: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux<<< 33932 1726882880.90912: stdout 
chunk (state=3): >>> # destroy getpass<<< 33932 1726882880.90960: stdout chunk (state=3): >>> <<< 33932 1726882880.90975: stdout chunk (state=3): >>># destroy json <<< 33932 1726882880.90978: stdout chunk (state=3): >>># destroy socket<<< 33932 1726882880.90980: stdout chunk (state=3): >>> # destroy struct <<< 33932 1726882880.90982: stdout chunk (state=3): >>># destroy glob<<< 33932 1726882880.90984: stdout chunk (state=3): >>> <<< 33932 1726882880.90986: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout <<< 33932 1726882880.90988: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection <<< 33932 1726882880.90990: stdout chunk (state=3): >>># destroy tempfile<<< 33932 1726882880.90992: stdout chunk (state=3): >>> # destroy multiprocessing.context <<< 33932 1726882880.91106: stdout chunk (state=3): >>># destroy multiprocessing.process<<< 33932 1726882880.91171: stdout chunk (state=3): >>> # destroy multiprocessing.util <<< 33932 1726882880.91217: stdout chunk (state=3): >>># destroy array # destroy multiprocessing.dummy.connection <<< 33932 1726882880.91254: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing <<< 33932 1726882880.91302: stdout chunk (state=3): >>># cleanup[3] wiping _queue <<< 33932 1726882880.91324: stdout chunk (state=3): >>># cleanup[3] wiping _pickle <<< 33932 1726882880.91372: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux <<< 33932 1726882880.91419: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian <<< 33932 1726882880.91476: stdout chunk (state=3): >>># cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc <<< 33932 1726882880.91537: stdout chunk 
(state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser<<< 33932 1726882880.91557: stdout chunk (state=3): >>> # cleanup[3] wiping systemd._daemon <<< 33932 1726882880.91597: stdout chunk (state=3): >>># cleanup[3] wiping _socket <<< 33932 1726882880.91624: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader<<< 33932 1726882880.91717: stdout chunk (state=3): >>> # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache <<< 33932 1726882880.91744: stdout chunk (state=3): >>># cleanup[3] wiping tokenize <<< 33932 1726882880.91803: stdout chunk (state=3): >>># cleanup[3] wiping platform <<< 33932 1726882880.91825: stdout chunk (state=3): >>># destroy subprocess <<< 33932 1726882880.91885: stdout chunk (state=3): >>># cleanup[3] wiping selectors # cleanup[3] wiping select<<< 33932 1726882880.91921: stdout chunk (state=3): >>> <<< 33932 1726882880.91978: stdout chunk (state=3): >>># cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal<<< 33932 1726882880.91982: stdout chunk (state=3): >>> # cleanup[3] wiping fcntl # cleanup[3] wiping atexit<<< 33932 1726882880.91984: stdout chunk (state=3): >>> # cleanup[3] wiping encodings.cp437<<< 33932 1726882880.91987: stdout chunk (state=3): >>> # cleanup[3] wiping _blake2 <<< 33932 1726882880.91989: stdout chunk (state=3): >>># cleanup[3] wiping _hashlib <<< 33932 1726882880.91991: stdout chunk (state=3): >>># cleanup[3] wiping _random # cleanup[3] wiping _bisect <<< 33932 1726882880.92020: stdout chunk (state=3): >>># cleanup[3] wiping math <<< 33932 1726882880.92023: stdout chunk (state=3): >>># cleanup[3] wiping shutil <<< 33932 1726882880.92045: stdout chunk (state=3): >>># destroy fnmatch <<< 33932 1726882880.92049: stdout chunk (state=3): >>># cleanup[3] wiping grp # cleanup[3] wiping pwd <<< 33932 1726882880.92051: stdout chunk 
(state=3): >>># cleanup[3] wiping _lzma <<< 33932 1726882880.92053: stdout chunk (state=3): >>># cleanup[3] wiping threading <<< 33932 1726882880.92055: stdout chunk (state=3): >>># cleanup[3] wiping zlib <<< 33932 1726882880.92057: stdout chunk (state=3): >>># cleanup[3] wiping errno <<< 33932 1726882880.92059: stdout chunk (state=3): >>># cleanup[3] wiping weakref <<< 33932 1726882880.92060: stdout chunk (state=3): >>># cleanup[3] wiping contextlib <<< 33932 1726882880.92062: stdout chunk (state=3): >>># cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools <<< 33932 1726882880.92067: stdout chunk (state=3): >>># cleanup[3] wiping _functools # destroy _functools<<< 33932 1726882880.92144: stdout chunk (state=3): >>> # cleanup[3] wiping collections<<< 33932 1726882880.92146: stdout chunk (state=3): >>> # destroy _collections_abc <<< 33932 1726882880.92207: stdout chunk (state=3): >>># destroy heapq <<< 33932 1726882880.92211: stdout chunk (state=3): >>># destroy collections.abc # cleanup[3] wiping _collections <<< 33932 1726882880.92213: stdout chunk (state=3): >>># destroy _collections <<< 33932 1726882880.92215: stdout chunk (state=3): >>># cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os <<< 33932 1726882880.92217: stdout chunk (state=3): >>># cleanup[3] wiping os.path <<< 33932 1726882880.92219: stdout chunk (state=3): >>># destroy genericpath <<< 33932 1726882880.92221: stdout chunk (state=3): >>># cleanup[3] wiping posixpath <<< 33932 1726882880.92223: stdout chunk (state=3): >>># cleanup[3] wiping stat <<< 33932 
1726882880.92233: stdout chunk (state=3): >>># cleanup[3] wiping _stat <<< 33932 1726882880.92241: stdout chunk (state=3): >>># destroy _stat<<< 33932 1726882880.92251: stdout chunk (state=3): >>> # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8<<< 33932 1726882880.92265: stdout chunk (state=3): >>> # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins<<< 33932 1726882880.92270: stdout chunk (state=3): >>> # destroy unicodedata # destroy gc # destroy termios<<< 33932 1726882880.92272: stdout chunk (state=3): >>> # destroy _ssl<<< 33932 1726882880.92275: stdout chunk (state=3): >>> # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2<<< 33932 1726882880.92423: stdout chunk (state=3): >>> # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid <<< 33932 1726882880.92426: stdout chunk (state=3): >>># destroy _sre # destroy sre_parse <<< 33932 1726882880.92429: stdout chunk (state=3): >>># destroy tokenize <<< 33932 1726882880.92482: stdout chunk (state=3): >>># destroy _heapq <<< 33932 1726882880.92486: stdout chunk (state=3): >>># destroy posixpath <<< 33932 1726882880.92489: stdout chunk (state=3): >>># destroy stat <<< 33932 1726882880.92559: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy errno <<< 33932 1726882880.92562: 
stdout chunk (state=3): >>># destroy signal # destroy contextlib # destroy pwd<<< 33932 1726882880.92568: stdout chunk (state=3): >>> # destroy grp <<< 33932 1726882880.92570: stdout chunk (state=3): >>># destroy _posixsubprocess # destroy selectors <<< 33932 1726882880.92691: stdout chunk (state=3): >>># destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error<<< 33932 1726882880.92701: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib.request <<< 33932 1726882880.92704: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser <<< 33932 1726882880.92706: stdout chunk (state=3): >>># destroy functools # destroy itertools <<< 33932 1726882880.92709: stdout chunk (state=3): >>># destroy operator # destroy ansible.module_utils.six.moves<<< 33932 1726882880.92711: stdout chunk (state=3): >>> # destroy _operator<<< 33932 1726882880.92713: stdout chunk (state=3): >>> <<< 33932 1726882880.92768: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp <<< 33932 1726882880.92847: stdout chunk (state=3): >>># destroy io # destroy marshal <<< 33932 1726882880.92859: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 33932 1726882880.93392: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 33932 1726882880.93409: stderr chunk (state=3): >>><<< 33932 1726882880.93412: stdout chunk (state=3): >>><<< 33932 1726882880.93644: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa8d8dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa87d3a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa8d8b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa8d8ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa87d490> # 
/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa87d940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa87d670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa834190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa834220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa857850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa834940> import 'os' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f85aa895880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa82dd90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa857d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa87d970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa5d3eb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa5d6f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # 
/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa5cc610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa5d2640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa5d3370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa4b9dc0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa4b98b0> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa4b9eb0> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches 
/usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa4b9f70> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa4b9e80> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa5aed30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa5a7610> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa5bb670> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa5dae20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa4cbc70> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa5ae250> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa5bb280> import 
'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa5e09d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa4cbfa0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa4cbd90> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa4cbd00> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa49e370> # 
/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa49e460> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa4d3fa0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa4cda30> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa4cd490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa3d21c0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa489c70> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa4cdeb0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa5e0040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa3e4af0> import 'errno' # # extension module 'zlib' loaded from 
'/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa3e4e20> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa3f6730> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa3f6c70> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa38e3a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa3e4f10> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa39f280> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa3f65b0> import 'pwd' # # extension module 'grp' 
loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa39f340> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa4cb9d0> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa3ba6a0> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa3ba970> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa3ba760> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa3ba850> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc 
matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa3baca0> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa3c71f0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa3ba8e0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa3aea30> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa4cb5b0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa3baa90> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f85aa2e4670> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa1787f0> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa209760> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa209640> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa209370> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa209490> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa209190> import 'atexit' # # extension module 'fcntl' loaded from 
'/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa209400> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa2097c0> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa1e27c0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa1e2b50> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa1e29a0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from 
'/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a9bc74f0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa202d30> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa209520> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa202190> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa233a90> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa1d6190> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa1d6790> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a9bcdd00> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa1d66a0> # 
/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa257d30> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa1599a0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa262e50> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa1690d0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa262e20> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa269220> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa169100> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa22db80> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa262ac0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa262d00> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa2e4820> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module 
'_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa1650d0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa15b370> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa165d00> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa1656a0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa166130> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # 
zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa1a18b0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa1a6910> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a97c96a0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa1e07f0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a97ced90> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85aa1950a0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a9795070> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import 
ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa19d160> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85aa19acd0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a97cebb0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # 
zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a9549a60> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from 
'/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a97a86d0> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85a97a8af0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a978f250> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a978fa30> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a97de460> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a97de910> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85a97dbd00> import 'queue' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f85a97dbd60> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a97db250> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85a95b1f70> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a97f34c0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a97de310> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f85a94c9ca0> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a94c9fd0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85a94b5370> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a950cbb0> import ansible.module_utils.facts.system.python # 
loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85a9444160> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a94442b0> import ansible.module_utils.facts.system.user # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available 
import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # 
zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: 
zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yat3d4jx/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available import 'gc' # # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' 
import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f85a9495eb0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a94951f0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a92adc70> # /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a9495100> # /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a94445e0> # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a92a42e0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f85a943f8b0> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": 
{"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_iscsi_iqn": "", "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": 
"255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", 
"rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off 
[fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2804, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 
0, "ansible_memory_mb": {"real": {"total": 3532, "used": 728, "free": 2804}, "nocache": {"free": 3269, "used": 263}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1038, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264234647552, "block_size": 4096, "block_total": 65519355, "block_available": 64510412, "block_used": 
1008943, "inode_total": 131071472, "inode_available": 130998693, "inode_used": 72779, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "41", "second": "20", "epoch": "1726882880", "epoch_int": "1726882880", "date": "2024-09-20", "time": "21:41:20", "iso8601_micro": "2024-09-21T01:41:20.862153Z", "iso8601": "2024-09-21T01:41:20Z", "iso8601_basic": "20240920T214120862153", "iso8601_basic_short": "20240920T214120", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_is_chroot": false, "ansible_loadavg": {"1m": 0.51, "5m": 0.52, "15m": 0.32}, 
"ansible_lsb": {}, "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # 
cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] 
removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # 
cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy 
ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing 
ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing 
ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing 
ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy 
ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing 
multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] 
wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # 
cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
[WARNING]: Module invocation had junk after the JSON data: 
removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # 
cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing 
ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy 
ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy 
ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy 
ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # 
cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy 
posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks [WARNING]: Platform linux on host managed_node1 is using the discovered Python interpreter at /usr/bin/python3.9, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 
33932 1726882880.94538: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882879.6009996-33970-176995712533830/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 33932 1726882880.94542: _low_level_execute_command(): starting 33932 1726882880.94544: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882879.6009996-33970-176995712533830/ > /dev/null 2>&1 && sleep 0' 33932 1726882880.94858: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882880.94879: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882880.94895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882880.94913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882880.94971: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882880.94987: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882880.95003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882880.95021: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882880.95034: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 
10.31.44.90 is address <<< 33932 1726882880.95055: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882880.95081: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882880.95123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882880.95140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882880.95151: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882880.95176: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882880.95195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882880.95281: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882880.95306: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882880.95321: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882880.95457: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882880.97687: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882880.97738: stderr chunk (state=3): >>><<< 33932 1726882880.97741: stdout chunk (state=3): >>><<< 33932 1726882880.98169: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882880.98175: handler run complete 33932 1726882880.98181: variable 'ansible_facts' from source: unknown 33932 1726882880.98183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882880.98444: variable 'ansible_facts' from source: unknown 33932 1726882880.98904: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882880.99054: attempt loop complete, returning result 33932 1726882880.99070: _execute() done 33932 1726882880.99080: dumping result to json 33932 1726882880.99116: done dumping result, returning 33932 1726882880.99135: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0e448fcc-3ce9-615b-5c48-0000000000af] 33932 1726882880.99160: sending task result for task 0e448fcc-3ce9-615b-5c48-0000000000af ok: [managed_node1] 33932 1726882881.00292: no more pending results, returning what we have 33932 1726882881.00295: results queue empty 33932 1726882881.00295: checking for any_errors_fatal 33932 1726882881.00297: done checking for any_errors_fatal 33932 1726882881.00297: checking for max_fail_percentage 33932 1726882881.00299: done checking for max_fail_percentage 33932 1726882881.00299: checking to see if all hosts have failed and the running result is not ok 33932 1726882881.00300: done 
checking to see if all hosts have failed 33932 1726882881.00301: getting the remaining hosts for this loop 33932 1726882881.00302: done getting the remaining hosts for this loop 33932 1726882881.00305: getting the next task for host managed_node1 33932 1726882881.00310: done getting next task for host managed_node1 33932 1726882881.00311: ^ task is: TASK: meta (flush_handlers) 33932 1726882881.00313: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33932 1726882881.00316: getting variables 33932 1726882881.00317: in VariableManager get_vars() 33932 1726882881.00338: Calling all_inventory to load vars for managed_node1 33932 1726882881.00340: Calling groups_inventory to load vars for managed_node1 33932 1726882881.00343: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882881.00353: Calling all_plugins_play to load vars for managed_node1 33932 1726882881.00366: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882881.00372: Calling groups_plugins_play to load vars for managed_node1 33932 1726882881.00584: done sending task result for task 0e448fcc-3ce9-615b-5c48-0000000000af 33932 1726882881.00588: WORKER PROCESS EXITING 33932 1726882881.00635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882881.01375: done with get_vars() 33932 1726882881.01386: done getting variables 33932 1726882881.01452: in VariableManager get_vars() 33932 1726882881.01462: Calling all_inventory to load vars for managed_node1 33932 1726882881.01472: Calling groups_inventory to load vars for managed_node1 33932 1726882881.01476: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882881.01482: Calling 
all_plugins_play to load vars for managed_node1 33932 1726882881.01484: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882881.01492: Calling groups_plugins_play to load vars for managed_node1 33932 1726882881.01928: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882881.02303: done with get_vars() 33932 1726882881.02317: done queuing things up, now waiting for results queue to drain 33932 1726882881.02319: results queue empty 33932 1726882881.02319: checking for any_errors_fatal 33932 1726882881.02321: done checking for any_errors_fatal 33932 1726882881.02322: checking for max_fail_percentage 33932 1726882881.02323: done checking for max_fail_percentage 33932 1726882881.02324: checking to see if all hosts have failed and the running result is not ok 33932 1726882881.02324: done checking to see if all hosts have failed 33932 1726882881.02325: getting the remaining hosts for this loop 33932 1726882881.02326: done getting the remaining hosts for this loop 33932 1726882881.02328: getting the next task for host managed_node1 33932 1726882881.02332: done getting next task for host managed_node1 33932 1726882881.02334: ^ task is: TASK: Include the task 'el_repo_setup.yml' 33932 1726882881.02336: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882881.02338: getting variables 33932 1726882881.02343: in VariableManager get_vars() 33932 1726882881.02351: Calling all_inventory to load vars for managed_node1 33932 1726882881.02353: Calling groups_inventory to load vars for managed_node1 33932 1726882881.02355: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882881.02359: Calling all_plugins_play to load vars for managed_node1 33932 1726882881.02361: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882881.02366: Calling groups_plugins_play to load vars for managed_node1 33932 1726882881.02505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882881.02706: done with get_vars() 33932 1726882881.02714: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_vlan_mtu_nm.yml:11 Friday 20 September 2024 21:41:21 -0400 (0:00:01.483) 0:00:01.495 ****** 33932 1726882881.02789: entering _queue_task() for managed_node1/include_tasks 33932 1726882881.02791: Creating lock for include_tasks 33932 1726882881.03068: worker is 1 (out of 1 available) 33932 1726882881.03082: exiting _queue_task() for managed_node1/include_tasks 33932 1726882881.03093: done queuing things up, now waiting for results queue to drain 33932 1726882881.03095: waiting for pending results... 
33932 1726882881.03363: running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' 33932 1726882881.03484: in run() - task 0e448fcc-3ce9-615b-5c48-000000000006 33932 1726882881.03502: variable 'ansible_search_path' from source: unknown 33932 1726882881.03548: calling self._execute() 33932 1726882881.03620: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882881.03630: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882881.03642: variable 'omit' from source: magic vars 33932 1726882881.03756: _execute() done 33932 1726882881.03772: dumping result to json 33932 1726882881.03780: done dumping result, returning 33932 1726882881.03790: done running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' [0e448fcc-3ce9-615b-5c48-000000000006] 33932 1726882881.03801: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000006 33932 1726882881.03935: no more pending results, returning what we have 33932 1726882881.03941: in VariableManager get_vars() 33932 1726882881.03976: Calling all_inventory to load vars for managed_node1 33932 1726882881.03979: Calling groups_inventory to load vars for managed_node1 33932 1726882881.03983: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882881.03997: Calling all_plugins_play to load vars for managed_node1 33932 1726882881.04003: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882881.04007: Calling groups_plugins_play to load vars for managed_node1 33932 1726882881.04221: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882881.04884: done with get_vars() 33932 1726882881.04909: variable 'ansible_search_path' from source: unknown 33932 1726882881.04959: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000006 33932 1726882881.04963: WORKER PROCESS EXITING 33932 1726882881.04994: we have 
included files to process 33932 1726882881.04995: generating all_blocks data 33932 1726882881.05015: done generating all_blocks data 33932 1726882881.05017: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 33932 1726882881.05018: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 33932 1726882881.05021: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 33932 1726882881.06887: in VariableManager get_vars() 33932 1726882881.06904: done with get_vars() 33932 1726882881.06939: done processing included file 33932 1726882881.06941: iterating over new_blocks loaded from include file 33932 1726882881.06943: in VariableManager get_vars() 33932 1726882881.06955: done with get_vars() 33932 1726882881.06956: filtering new block on tags 33932 1726882881.06973: done filtering new block on tags 33932 1726882881.06980: in VariableManager get_vars() 33932 1726882881.06995: done with get_vars() 33932 1726882881.06997: filtering new block on tags 33932 1726882881.07035: done filtering new block on tags 33932 1726882881.07038: in VariableManager get_vars() 33932 1726882881.07234: done with get_vars() 33932 1726882881.07236: filtering new block on tags 33932 1726882881.07271: done filtering new block on tags 33932 1726882881.07274: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node1 33932 1726882881.07279: extending task lists for all hosts with included blocks 33932 1726882881.07333: done extending task lists 33932 1726882881.07334: done processing included files 33932 1726882881.07335: results queue empty 33932 1726882881.07335: checking for any_errors_fatal 33932 1726882881.07337: done checking for any_errors_fatal 33932 
1726882881.07337: checking for max_fail_percentage 33932 1726882881.07338: done checking for max_fail_percentage 33932 1726882881.07339: checking to see if all hosts have failed and the running result is not ok 33932 1726882881.07340: done checking to see if all hosts have failed 33932 1726882881.07340: getting the remaining hosts for this loop 33932 1726882881.07342: done getting the remaining hosts for this loop 33932 1726882881.07344: getting the next task for host managed_node1 33932 1726882881.07347: done getting next task for host managed_node1 33932 1726882881.07349: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 33932 1726882881.07352: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882881.07353: getting variables 33932 1726882881.07354: in VariableManager get_vars() 33932 1726882881.07362: Calling all_inventory to load vars for managed_node1 33932 1726882881.07366: Calling groups_inventory to load vars for managed_node1 33932 1726882881.07368: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882881.07373: Calling all_plugins_play to load vars for managed_node1 33932 1726882881.07375: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882881.07378: Calling groups_plugins_play to load vars for managed_node1 33932 1726882881.07548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882881.07752: done with get_vars() 33932 1726882881.07760: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 21:41:21 -0400 (0:00:00.050) 0:00:01.545 ****** 33932 1726882881.07827: entering _queue_task() for managed_node1/setup 33932 1726882881.08425: worker is 1 (out of 1 available) 33932 1726882881.08437: exiting _queue_task() for managed_node1/setup 33932 1726882881.08513: done queuing things up, now waiting for results queue to drain 33932 1726882881.08515: waiting for pending results... 
33932 1726882881.08810: running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 33932 1726882881.08946: in run() - task 0e448fcc-3ce9-615b-5c48-0000000000c0 33932 1726882881.08990: variable 'ansible_search_path' from source: unknown 33932 1726882881.08998: variable 'ansible_search_path' from source: unknown 33932 1726882881.09039: calling self._execute() 33932 1726882881.09125: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882881.09135: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882881.09147: variable 'omit' from source: magic vars 33932 1726882881.10265: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 33932 1726882881.15781: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 33932 1726882881.15879: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 33932 1726882881.15922: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 33932 1726882881.15973: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 33932 1726882881.16006: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 33932 1726882881.16138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882881.16346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882881.16395: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882881.16520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882881.16540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882881.16981: variable 'ansible_facts' from source: unknown 33932 1726882881.17142: variable 'network_test_required_facts' from source: task vars 33932 1726882881.17195: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 33932 1726882881.17207: variable 'omit' from source: magic vars 33932 1726882881.17253: variable 'omit' from source: magic vars 33932 1726882881.17302: variable 'omit' from source: magic vars 33932 1726882881.17333: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882881.17372: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882881.17401: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882881.17423: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882881.17439: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882881.17485: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882881.17498: variable 'ansible_host' from source: host vars for 
'managed_node1' 33932 1726882881.17506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882881.17622: Set connection var ansible_shell_executable to /bin/sh 33932 1726882881.17636: Set connection var ansible_timeout to 10 33932 1726882881.17645: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882881.17654: Set connection var ansible_pipelining to False 33932 1726882881.17660: Set connection var ansible_connection to ssh 33932 1726882881.17676: Set connection var ansible_shell_type to sh 33932 1726882881.17714: variable 'ansible_shell_executable' from source: unknown 33932 1726882881.17722: variable 'ansible_connection' from source: unknown 33932 1726882881.17730: variable 'ansible_module_compression' from source: unknown 33932 1726882881.17737: variable 'ansible_shell_type' from source: unknown 33932 1726882881.17742: variable 'ansible_shell_executable' from source: unknown 33932 1726882881.17748: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882881.17755: variable 'ansible_pipelining' from source: unknown 33932 1726882881.17760: variable 'ansible_timeout' from source: unknown 33932 1726882881.17772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882881.17934: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 33932 1726882881.17949: variable 'omit' from source: magic vars 33932 1726882881.17957: starting attempt loop 33932 1726882881.17964: running the handler 33932 1726882881.17983: _low_level_execute_command(): starting 33932 1726882881.18000: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 33932 1726882881.18735: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 
1726882881.18762: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882881.18799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882881.18831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882881.18877: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882881.18894: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882881.18909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882881.18932: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882881.18945: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882881.18956: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882881.18973: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882881.18992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882881.19009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882881.19022: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882881.19040: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882881.19055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882881.19141: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882881.19161: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882881.19181: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882881.19318: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882881.20935: stdout chunk (state=3): >>>/root <<< 33932 1726882881.21146: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882881.21150: stdout chunk (state=3): >>><<< 33932 1726882881.21152: stderr chunk (state=3): >>><<< 33932 1726882881.21273: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882881.21277: _low_level_execute_command(): starting 33932 1726882881.21280: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882881.2117631-34032-45417688756060 `" && echo ansible-tmp-1726882881.2117631-34032-45417688756060="` echo /root/.ansible/tmp/ansible-tmp-1726882881.2117631-34032-45417688756060 
`" ) && sleep 0' 33932 1726882881.21992: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882881.22008: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882881.22023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882881.22045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882881.22380: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882881.22666: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882881.22999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882881.23019: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882881.23031: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882881.23041: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882881.23052: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882881.23071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882881.23096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882881.23116: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882881.23127: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882881.23140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882881.23242: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882881.23287: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 
1726882881.23304: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882881.23446: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882881.25328: stdout chunk (state=3): >>>ansible-tmp-1726882881.2117631-34032-45417688756060=/root/.ansible/tmp/ansible-tmp-1726882881.2117631-34032-45417688756060 <<< 33932 1726882881.25514: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882881.25517: stdout chunk (state=3): >>><<< 33932 1726882881.25519: stderr chunk (state=3): >>><<< 33932 1726882881.25777: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882881.2117631-34032-45417688756060=/root/.ansible/tmp/ansible-tmp-1726882881.2117631-34032-45417688756060 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882881.25780: variable 'ansible_module_compression' 
from source: unknown 33932 1726882881.25783: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-33932njcnmxb6/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 33932 1726882881.25785: variable 'ansible_facts' from source: unknown 33932 1726882881.25915: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882881.2117631-34032-45417688756060/AnsiballZ_setup.py 33932 1726882881.26592: Sending initial data 33932 1726882881.26595: Sent initial data (153 bytes) 33932 1726882881.28633: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882881.28638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882881.28658: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882881.28661: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882881.28738: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882881.28743: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882881.28745: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882881.28839: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 33932 1726882881.30560: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 33932 1726882881.30652: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 33932 1726882881.30756: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-33932njcnmxb6/tmpq_yty1p9 /root/.ansible/tmp/ansible-tmp-1726882881.2117631-34032-45417688756060/AnsiballZ_setup.py <<< 33932 1726882881.30844: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 33932 1726882881.33202: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882881.33409: stderr chunk (state=3): >>><<< 33932 1726882881.33413: stdout chunk (state=3): >>><<< 33932 1726882881.33434: done transferring module to remote 33932 1726882881.33439: _low_level_execute_command(): starting 33932 1726882881.33444: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882881.2117631-34032-45417688756060/ /root/.ansible/tmp/ansible-tmp-1726882881.2117631-34032-45417688756060/AnsiballZ_setup.py && sleep 0' 33932 1726882881.34522: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882881.34553: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882881.34888: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882881.34913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882881.35049: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882881.35066: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882881.35082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882881.35099: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882881.35109: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882881.35132: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882881.35226: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882881.35230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882881.35331: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882881.35343: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882881.35462: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882881.37387: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882881.37430: stderr chunk (state=3): >>><<< 33932 1726882881.37433: stdout chunk (state=3): >>><<< 33932 1726882881.37446: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882881.37448: _low_level_execute_command(): starting 33932 1726882881.37453: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882881.2117631-34032-45417688756060/AnsiballZ_setup.py && sleep 0' 33932 1726882881.37885: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882881.37888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882881.37926: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882881.37929: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration <<< 33932 1726882881.37931: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882881.37935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882881.37979: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882881.37994: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882881.38005: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882881.38115: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882881.40532: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 33932 1726882881.40627: stdout chunk (state=3): >>>import '_io' # <<< 33932 1726882881.40629: stdout chunk (state=3): >>>import 'marshal' # <<< 33932 1726882881.40662: stdout chunk (state=3): >>>import 'posix' # <<< 33932 1726882881.40706: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 33932 1726882881.40722: stdout chunk (state=3): >>># installing zipimport hook <<< 33932 1726882881.40766: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 33932 1726882881.40850: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 33932 1726882881.40872: stdout chunk 
(state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 33932 1726882881.40897: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # <<< 33932 1726882881.40914: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a1f3dc0> <<< 33932 1726882881.40959: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 33932 1726882881.40987: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' <<< 33932 1726882881.40992: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a1983a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a1f3b20> <<< 33932 1726882881.41019: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 33932 1726882881.41041: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a1f3ac0> <<< 33932 1726882881.41091: stdout chunk (state=3): >>>import '_signal' # <<< 33932 1726882881.41151: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a198490> <<< 33932 1726882881.41189: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from 
'/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 33932 1726882881.41198: stdout chunk (state=3): >>>import '_abc' # <<< 33932 1726882881.41203: stdout chunk (state=3): >>>import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a198940> <<< 33932 1726882881.41221: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a198670> <<< 33932 1726882881.41262: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 33932 1726882881.41277: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 33932 1726882881.41298: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 33932 1726882881.41344: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 33932 1726882881.41375: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 33932 1726882881.41397: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a14f190> <<< 33932 1726882881.41412: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 33932 1726882881.41441: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 33932 1726882881.41621: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a14f220> <<< 33932 1726882881.41652: stdout 
chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py <<< 33932 1726882881.41667: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 33932 1726882881.41692: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py <<< 33932 1726882881.41698: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a172850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a14f940> <<< 33932 1726882881.41727: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a1b0880> <<< 33932 1726882881.41753: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py <<< 33932 1726882881.41767: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' <<< 33932 1726882881.41778: stdout chunk (state=3): >>>import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a147d90> <<< 33932 1726882881.41832: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py <<< 33932 1726882881.41842: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 33932 1726882881.41844: stdout chunk (state=3): >>>import '_locale' # <<< 33932 1726882881.41849: stdout chunk (state=3): >>>import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a172d90> <<< 33932 1726882881.41918: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a198970> <<< 
33932 1726882881.41962: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 33932 1726882881.42506: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 33932 1726882881.42538: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 33932 1726882881.42578: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 33932 1726882881.42593: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 33932 1726882881.42611: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 33932 1726882881.42642: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 33932 1726882881.42661: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a0eeeb0> <<< 33932 1726882881.42712: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a0f1f40> <<< 33932 1726882881.42747: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 33932 1726882881.42771: stdout chunk (state=3): >>>import '_sre' # <<< 33932 1726882881.42784: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 33932 1726882881.42804: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 33932 1726882881.42829: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py <<< 33932 1726882881.42833: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 33932 1726882881.42874: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a0e7610> <<< 33932 1726882881.42894: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a0ed640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a0ee370> <<< 33932 1726882881.42918: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 33932 1726882881.43031: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 33932 1726882881.43043: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 33932 1726882881.43081: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 33932 1726882881.43106: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py <<< 33932 1726882881.43112: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 33932 1726882881.43140: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' <<< 33932 1726882881.43147: stdout chunk (state=3): >>># extension module '_heapq' executed from 
'/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae39d90e20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39d90910> <<< 33932 1726882881.43169: stdout chunk (state=3): >>>import 'itertools' # <<< 33932 1726882881.43193: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py <<< 33932 1726882881.43199: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39d90f10> <<< 33932 1726882881.43218: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 33932 1726882881.43240: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 33932 1726882881.43272: stdout chunk (state=3): >>>import '_operator' # <<< 33932 1726882881.43279: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39d90fd0> <<< 33932 1726882881.43314: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' <<< 33932 1726882881.43317: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39da30d0> <<< 33932 1726882881.43322: stdout chunk (state=3): >>>import '_collections' # <<< 33932 1726882881.43393: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a0c9d90> <<< 33932 1726882881.43400: stdout chunk (state=3): >>>import '_functools' # <<< 33932 1726882881.43431: stdout chunk (state=3): >>>import 'functools' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fae3a0c2670> <<< 33932 1726882881.43498: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py <<< 33932 1726882881.43516: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a0d56d0> <<< 33932 1726882881.43532: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a0f5e20> <<< 33932 1726882881.43541: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 33932 1726882881.43574: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' <<< 33932 1726882881.43579: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae39da3cd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a0c92b0> <<< 33932 1726882881.43617: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 33932 1726882881.43636: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae3a0d52e0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a0fb9d0> <<< 33932 1726882881.43661: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches 
/usr/lib64/python3.9/runpy.py <<< 33932 1726882881.43666: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 33932 1726882881.43695: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 33932 1726882881.43730: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 33932 1726882881.43748: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc'<<< 33932 1726882881.43751: stdout chunk (state=3): >>> import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39da3eb0> <<< 33932 1726882881.43760: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39da3df0> <<< 33932 1726882881.43789: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py <<< 33932 1726882881.43809: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39da3d60> <<< 33932 1726882881.43820: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 33932 1726882881.43853: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py <<< 33932 1726882881.43864: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 33932 
1726882881.43888: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 33932 1726882881.44001: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 33932 1726882881.44007: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39d763d0> <<< 33932 1726882881.44015: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 33932 1726882881.44033: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 33932 1726882881.44072: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39d764c0> <<< 33932 1726882881.44262: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39daaf40> <<< 33932 1726882881.44303: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39da5a90> <<< 33932 1726882881.44316: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39da5490> <<< 33932 1726882881.44339: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py <<< 33932 1726882881.44355: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 33932 1726882881.44392: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 33932 1726882881.44417: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 33932 1726882881.44438: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py <<< 33932 1726882881.44450: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' <<< 33932 1726882881.44455: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39cc4220> <<< 33932 1726882881.44496: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39d61520> <<< 33932 1726882881.44571: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39da5f10> <<< 33932 1726882881.44578: stdout chunk (state=3): >>>import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a0fb040> <<< 33932 1726882881.44600: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 33932 1726882881.44658: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 33932 1726882881.44661: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py <<< 33932 1726882881.44675: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39cd6b50> <<< 33932 1726882881.44683: stdout chunk (state=3): >>>import 'errno' # <<< 33932 1726882881.44718: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fae39cd6e80> <<< 33932 1726882881.44742: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py <<< 33932 1726882881.44747: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 33932 1726882881.44787: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' <<< 33932 1726882881.44797: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39ce7790> <<< 33932 1726882881.44840: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py<<< 33932 1726882881.44851: stdout chunk (state=3): >>> <<< 33932 1726882881.44863: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 33932 1726882881.44907: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39ce7cd0> <<< 33932 1726882881.44969: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae39c80400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39cd6f70> <<< 33932 1726882881.44985: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 33932 1726882881.45044: stdout chunk (state=3): >>># extension module '_lzma' loaded 
from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae39c912e0> <<< 33932 1726882881.45050: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39ce7610> <<< 33932 1726882881.45057: stdout chunk (state=3): >>>import 'pwd' # <<< 33932 1726882881.45085: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae39c913a0> <<< 33932 1726882881.45136: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39da3a30> <<< 33932 1726882881.45162: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 33932 1726882881.45201: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 33932 1726882881.45209: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 33932 1726882881.45225: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 33932 1726882881.45292: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' <<< 33932 1726882881.45299: stdout chunk (state=3): >>>import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae39cac700> <<< 33932 
1726882881.45308: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 33932 1726882881.45330: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae39cac9d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39cac7c0> <<< 33932 1726882881.45362: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' <<< 33932 1726882881.45373: stdout chunk (state=3): >>>import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae39cac8b0> <<< 33932 1726882881.45384: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py <<< 33932 1726882881.45390: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 33932 1726882881.45676: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' <<< 33932 1726882881.45692: stdout chunk (state=3): >>>import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae39cacd00> <<< 33932 1726882881.45702: stdout chunk (state=3): >>># extension module '_blake2' loaded from 
'/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae39cb7250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39cac940> <<< 33932 1726882881.45722: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39ca0a90> <<< 33932 1726882881.45755: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39da3610> <<< 33932 1726882881.45789: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 33932 1726882881.45871: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc'<<< 33932 1726882881.45885: stdout chunk (state=3): >>> <<< 33932 1726882881.45913: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39cacaf0> <<< 33932 1726882881.46122: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 33932 1726882881.46131: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fae39be06d0> <<< 33932 1726882881.46528: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip' # zipimport: zlib available <<< 33932 1726882881.46769: stdout chunk (state=3): >>># zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available <<< 33932 1726882881.46773: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip 
/tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/__init__.py <<< 33932 1726882881.46780: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.48681: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.50183: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py <<< 33932 1726882881.50195: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39543820> <<< 33932 1726882881.50204: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 33932 1726882881.50232: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 33932 1726882881.50257: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 33932 1726882881.50291: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' <<< 33932 1726882881.50296: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae395d2730> <<< 33932 1726882881.50352: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae395d2610> <<< 33932 1726882881.50392: stdout 
chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae395d2340> <<< 33932 1726882881.50413: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 33932 1726882881.50481: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae395d2460> <<< 33932 1726882881.50486: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae395d2160> import 'atexit' # <<< 33932 1726882881.50511: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' <<< 33932 1726882881.50518: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae395d23a0> <<< 33932 1726882881.50527: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 33932 1726882881.50579: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 33932 1726882881.50623: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae395d2790> <<< 33932 1726882881.50647: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 33932 1726882881.50666: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 33932 1726882881.50691: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 33932 1726882881.50709: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 33932 1726882881.50737: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 33932 1726882881.50877: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae395c2820> <<< 33932 1726882881.50917: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' <<< 33932 1726882881.50922: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae395c2490> <<< 33932 1726882881.50942: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' <<< 33932 1726882881.50950: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae395c2640> <<< 33932 1726882881.50972: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 33932 1726882881.50980: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 33932 1726882881.51020: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae394c8520> <<< 33932 1726882881.51044: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae395cdd60> <<< 33932 
1726882881.51314: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae395d24f0> <<< 33932 1726882881.51340: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py <<< 33932 1726882881.51348: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 33932 1726882881.51366: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae395cd1c0> <<< 33932 1726882881.51393: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py <<< 33932 1726882881.51395: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 33932 1726882881.51429: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py <<< 33932 1726882881.51445: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 33932 1726882881.51449: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 33932 1726882881.51467: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 33932 1726882881.51498: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py <<< 33932 1726882881.51501: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' <<< 33932 1726882881.51503: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae395d1b20> <<< 33932 1726882881.51632: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae395a1160> 
import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae395a1760> <<< 33932 1726882881.51638: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae394ced30> <<< 33932 1726882881.51679: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae395a1670> <<< 33932 1726882881.51708: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39b54d00> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 33932 1726882881.51729: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 33932 1726882881.51752: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 33932 1726882881.51790: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 33932 1726882881.51876: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae39524a00> <<< 33932 
1726882881.51908: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39b5ee80> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 33932 1726882881.51921: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 33932 1726882881.52052: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae395330a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39b5eeb0> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 33932 1726882881.52091: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' <<< 33932 1726882881.52103: stdout chunk (state=3): >>>import '_string' # <<< 33932 1726882881.52185: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39b66250> <<< 33932 1726882881.52391: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae395330d0> <<< 33932 1726882881.52515: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fae39b66a60> <<< 33932 1726882881.52551: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae395f7b80> <<< 33932 1726882881.52609: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' <<< 33932 1726882881.52625: stdout chunk (state=3): >>>import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae39b5ecd0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39b54ee0> <<< 33932 1726882881.52653: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 33932 1726882881.52676: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 33932 1726882881.52689: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 33932 1726882881.52733: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader 
object at 0x7fae3952f0d0> <<< 33932 1726882881.53037: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae39526310> <<< 33932 1726882881.53040: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3952fcd0> <<< 33932 1726882881.53078: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae3952f670> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39530100> <<< 33932 1726882881.53108: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py <<< 33932 1726882881.53120: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.53231: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.53351: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py <<< 33932 1726882881.53383: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 33932 1726882881.53396: stdout chunk (state=3): >>>import ansible.module_utils.common.text # loaded from Zip 
/tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available <<< 33932 1726882881.53551: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.53689: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.54421: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.55176: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py <<< 33932 1726882881.55191: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 33932 1726882881.55202: stdout chunk (state=3): >>>import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 33932 1726882881.55206: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 33932 1726882881.55214: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 33932 1726882881.55283: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' <<< 33932 1726882881.55288: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae3956d910> <<< 33932 1726882881.55370: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py <<< 33932 1726882881.55381: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 33932 1726882881.55385: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae395729a0> <<< 33932 1726882881.55391: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae390c6640> <<< 33932 1726882881.55437: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py <<< 33932 1726882881.55453: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.55476: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.55491: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/_text.py <<< 33932 1726882881.55496: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.55686: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.55889: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 33932 1726882881.55915: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae395a97f0> <<< 33932 1726882881.55918: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.56546: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.57134: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.57215: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.57307: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip 
/tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/common/collections.py <<< 33932 1726882881.57310: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.57348: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.57395: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py <<< 33932 1726882881.57398: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.57492: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.57593: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/errors.py <<< 33932 1726882881.57611: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.57618: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py <<< 33932 1726882881.57632: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.57675: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.57727: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 33932 1726882881.57733: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.58022: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.58316: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 33932 1726882881.58352: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 33932 1726882881.58366: stdout chunk (state=3): >>>import '_ast' # <<< 33932 1726882881.58463: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae395ef460> <<< 33932 1726882881.58468: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.58558: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.58649: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py<<< 33932 1726882881.58653: stdout chunk (state=3): >>> import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py <<< 33932 1726882881.58663: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 33932 1726882881.58678: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.58737: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.58773: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/common/locale.py <<< 33932 1726882881.58780: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.58838: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.58881: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.59004: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 
1726882881.59090: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 33932 1726882881.59123: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 33932 1726882881.59225: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae395610d0> <<< 33932 1726882881.59349: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae395721f0> <<< 33932 1726882881.59399: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/common/process.py <<< 33932 1726882881.59402: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.59482: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.59551: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.59589: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.59630: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 33932 1726882881.59642: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 33932 1726882881.59668: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 33932 1726882881.59709: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 33932 1726882881.59730: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 33932 1726882881.59762: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 33932 1726882881.59895: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39574bb0> <<< 33932 1726882881.59944: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39b6f070> <<< 33932 1726882881.60023: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae395642e0> <<< 33932 1726882881.60029: stdout chunk (state=3): >>># destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available <<< 33932 1726882881.60060: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.60105: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py <<< 33932 1726882881.60108: stdout chunk (state=3): >>>import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 33932 1726882881.60246: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/basic.py <<< 33932 1726882881.60255: stdout chunk (state=3): >>># 
zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available <<< 33932 1726882881.60322: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.60405: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.60419: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.60447: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.60486: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.60534: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.60581: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.60615: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py <<< 33932 1726882881.60627: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.60728: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.60893: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.60905: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available <<< 33932 1726882881.61126: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.61344: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.61390: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.61450: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py <<< 33932 1726882881.61456: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 33932 1726882881.61483: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py <<< 33932 1726882881.61489: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' <<< 33932 1726882881.61512: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py <<< 33932 1726882881.61515: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 33932 1726882881.61554: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39079400> <<< 33932 1726882881.61584: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py <<< 33932 1726882881.61589: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 33932 1726882881.61615: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 33932 1726882881.61648: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 33932 1726882881.61682: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py <<< 33932 1726882881.61684: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' <<< 33932 1726882881.61704: stdout chunk (state=3): >>>import '_compat_pickle' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fae390d89a0> <<< 33932 1726882881.61748: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' <<< 33932 1726882881.61756: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae390d8df0> <<< 33932 1726882881.61841: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae390d6490> <<< 33932 1726882881.61866: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae38f51040> <<< 33932 1726882881.61889: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae38e413a0> <<< 33932 1726882881.61903: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae38e415e0> <<< 33932 1726882881.61917: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 33932 1726882881.61945: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' <<< 33932 1726882881.61977: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py <<< 33932 1726882881.61980: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 33932 1726882881.62017: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' <<< 33932 1726882881.62021: stdout chunk (state=3): >>># extension module '_queue' executed from 
'/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae395606d0> <<< 33932 1726882881.62025: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae390e6730> <<< 33932 1726882881.62048: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py <<< 33932 1726882881.62060: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 33932 1726882881.62093: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae395605e0> <<< 33932 1726882881.62108: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 33932 1726882881.62140: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 33932 1726882881.62177: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' <<< 33932 1726882881.62183: stdout chunk (state=3): >>># extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae39088c70> <<< 33932 1726882881.62205: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae38ea09a0> <<< 33932 1726882881.62238: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae38e414f0> import ansible.module_utils.facts.timeout # loaded from Zip 
/tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py <<< 33932 1726882881.62245: stdout chunk (state=3): >>>import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py <<< 33932 1726882881.62267: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.62280: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py <<< 33932 1726882881.62298: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.62377: stdout chunk (state=3): >>># zipimport: zlib available<<< 33932 1726882881.62382: stdout chunk (state=3): >>> <<< 33932 1726882881.62440: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py <<< 33932 1726882881.62443: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.62499: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.62577: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py <<< 33932 1726882881.62582: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.62585: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py <<< 33932 1726882881.62601: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.62630: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 
1726882881.62673: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py <<< 33932 1726882881.62679: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.62749: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.62801: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py <<< 33932 1726882881.62804: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.62865: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.62901: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available <<< 33932 1726882881.62988: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.63046: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.63112: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.63191: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py <<< 33932 1726882881.63196: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available <<< 33932 1726882881.63838: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.64419: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip 
/tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py <<< 33932 1726882881.64428: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.64491: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.64558: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.64591: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.64634: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py <<< 33932 1726882881.64639: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.64670: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.64704: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py <<< 33932 1726882881.64707: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.64780: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.64843: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py <<< 33932 1726882881.64850: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.64881: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.64914: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip 
/tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py <<< 33932 1726882881.64918: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.64956: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.64990: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py <<< 33932 1726882881.64998: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.65098: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.65208: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py <<< 33932 1726882881.65219: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 33932 1726882881.65257: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae38e419d0> <<< 33932 1726882881.65286: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 33932 1726882881.65324: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 33932 1726882881.65607: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae38dc0f40> <<< 33932 1726882881.65615: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py <<< 33932 1726882881.65629: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.65717: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.65795: stdout chunk (state=3): >>>import 
ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py <<< 33932 1726882881.65810: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.65953: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.66052: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py <<< 33932 1726882881.66081: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.66200: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.66309: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py <<< 33932 1726882881.66324: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.66391: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.66456: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 33932 1726882881.66488: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 33932 1726882881.66724: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' <<< 33932 1726882881.66730: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' <<< 33932 1726882881.66735: stdout chunk (state=3): >>>import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae38db83a0> <<< 33932 1726882881.67161: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fae38e06100> <<< 33932 1726882881.67176: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available <<< 33932 1726882881.67255: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.67309: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py <<< 33932 1726882881.67347: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.67444: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.67552: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.67700: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.67902: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py <<< 33932 1726882881.67915: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.67949: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.68012: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py <<< 33932 1726882881.68019: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.68076: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.68182: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc 
matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 33932 1726882881.68201: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae38d4c6a0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae38d4ca90> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py <<< 33932 1726882881.68257: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available <<< 33932 1726882881.68410: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available <<< 33932 1726882881.68462: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.68602: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py <<< 33932 1726882881.68619: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.68692: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.68768: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.68821: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 
1726882881.68842: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available <<< 33932 1726882881.68933: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.68956: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.69088: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.69203: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py <<< 33932 1726882881.69207: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.69304: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.69429: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available <<< 33932 1726882881.69475: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.69492: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.70141: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.70894: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py <<< 33932 1726882881.70901: 
stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py <<< 33932 1726882881.70909: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.71042: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.71176: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py <<< 33932 1726882881.71188: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.71307: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.71438: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py <<< 33932 1726882881.71443: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.71647: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.71840: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py <<< 33932 1726882881.71859: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.71881: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.71885: stdout chunk (state=3): >>>import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py <<< 33932 1726882881.71905: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.71949: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.72007: stdout chunk (state=3): 
>>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py <<< 33932 1726882881.72011: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.72139: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.72322: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.72422: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.72609: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 33932 1726882881.72612: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.72633: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.72655: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py <<< 33932 1726882881.72679: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.72697: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.72717: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available <<< 33932 1726882881.72782: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.72836: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip 
/tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py <<< 33932 1726882881.72858: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.72886: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.72898: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available <<< 33932 1726882881.72940: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.72999: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available <<< 33932 1726882881.73060: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.73105: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py <<< 33932 1726882881.73123: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.73332: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.73542: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available <<< 33932 1726882881.73591: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.73656: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available <<< 33932 1726882881.73694: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 33932 1726882881.73721: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py <<< 33932 1726882881.73729: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.73752: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.73785: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py <<< 33932 1726882881.73792: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.73816: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.73870: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py <<< 33932 1726882881.73873: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.73953: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.74075: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py <<< 33932 1726882881.74085: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available <<< 33932 1726882881.74131: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py # 
zipimport: zlib available <<< 33932 1726882881.74141: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.74173: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.74195: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.74246: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.74386: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.74403: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available <<< 33932 1726882881.74425: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.74475: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py <<< 33932 1726882881.74483: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.74638: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.74803: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available <<< 33932 1726882881.74843: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.74887: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip 
/tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py <<< 33932 1726882881.74894: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.74933: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.74979: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available <<< 33932 1726882881.75054: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.75121: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py <<< 33932 1726882881.75125: stdout chunk (state=3): >>>import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py <<< 33932 1726882881.75130: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.75204: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.75281: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py <<< 33932 1726882881.75288: stdout chunk (state=3): >>>import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 33932 1726882881.75365: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882881.75495: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py <<< 33932 1726882881.75507: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 33932 1726882881.75514: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py <<< 33932 1726882881.75540: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 33932 1726882881.75580: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae38d2e0d0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae38d30880> <<< 33932 1726882881.75638: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae38d30fd0> <<< 33932 1726882881.77896: stdout chunk (state=3): >>>import 'gc' # <<< 33932 1726882881.78492: stdout chunk (state=3): >>> <<< 33932 1726882881.78538: stdout chunk (state=3): >>>{"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_cmdline": 
{"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_service_mgr": "systemd", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", 
"DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "41", "second": "21", "epoch": "1726882881", "epoch_int": "1726882881", "date": "2024-09-20", "time": "21:41:21", "iso8601_micro": "2024-09-21T01:41:21.782158Z", "iso8601": "2024-09-21T01:41:21Z", "iso8601_basic": "20240920T214121782158", "iso8601_basic_short": "20240920T214121", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+an<<< 33932 1726882881.78545: stdout chunk (state=3): >>>JVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", 
"ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 33932 1726882881.79209: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache <<< 33932 1726882881.79215: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp <<< 33932 1726882881.79218: stdout chunk (state=3): >>># cleanup[2] removing 
_thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections<<< 33932 1726882881.79225: stdout chunk (state=3): >>> # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external <<< 33932 1726882881.79228: stdout chunk (state=3): >>># cleanup[2] removing warnings # 
cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib<<< 33932 1726882881.79234: stdout chunk (state=3): >>> # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp <<< 33932 1726882881.79240: stdout chunk (state=3): >>># cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect <<< 33932 1726882881.79243: stdout chunk (state=3): >>># cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder<<< 33932 1726882881.79245: stdout chunk (state=3): >>> # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] 
removing platform # cleanup[2] removing shlex<<< 33932 1726882881.79247: stdout chunk (state=3): >>> # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid <<< 33932 1726882881.79249: stdout chunk (state=3): >>># cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array <<< 33932 1726882881.79251: stdout chunk (state=3): >>># cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text <<< 33932 1726882881.79254: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes <<< 33932 1726882881.79256: stdout chunk (state=3): >>># destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing 
ansible.module_utils.common.collections <<< 33932 1726882881.79258: stdout chunk (state=3): >>># destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors<<< 33932 1726882881.79260: stdout chunk (state=3): >>> # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation <<< 33932 1726882881.79280: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # 
cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout <<< 33932 1726882881.79317: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing 
ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware <<< 33932 1726882881.79333: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing 
ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # 
destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # 
destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc <<< 33932 1726882881.79661: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 33932 1726882881.79667: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport <<< 33932 1726882881.79672: stdout chunk (state=3): >>># destroy _compression <<< 33932 1726882881.79677: stdout chunk (state=3): >>># destroy binascii # destroy importlib # destroy bz2 # destroy lzma <<< 33932 1726882881.79691: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings <<< 33932 1726882881.80222: stdout chunk (state=3): >>># destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy 
argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # 
cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy 
systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 33932 1726882881.80352: stdout chunk (state=3): >>># destroy platform # destroy _uuid <<< 33932 1726882881.80435: stdout chunk (state=3): >>># destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath <<< 33932 1726882881.80448: stdout chunk (state=3): >>># destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 33932 1726882881.80502: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 33932 1726882881.81075: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 33932 1726882881.81147: stdout chunk (state=3): >>><<< 33932 1726882881.81249: stderr chunk (state=3): >>><<< 33932 1726882881.81614: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a1f3dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a1983a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a1f3b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a1f3ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a198490> # 
/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a198940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a198670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a14f190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a14f220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a172850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a14f940> import 'os' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fae3a1b0880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a147d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a172d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a198970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a0eeeb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a0f1f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # 
/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a0e7610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a0ed640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a0ee370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae39d90e20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39d90910> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39d90f10> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches 
/usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39d90fd0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39da30d0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a0c9d90> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a0c2670> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a0d56d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a0f5e20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae39da3cd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a0c92b0> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae3a0d52e0> import 
'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a0fb9d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39da3eb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39da3df0> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39da3d60> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39d763d0> # 
/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39d764c0> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39daaf40> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39da5a90> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39da5490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39cc4220> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39d61520> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39da5f10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3a0fb040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39cd6b50> import 'errno' # # extension module 'zlib' loaded from 
'/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae39cd6e80> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39ce7790> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39ce7cd0> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae39c80400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39cd6f70> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae39c912e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39ce7610> import 'pwd' # # extension module 'grp' 
loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae39c913a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39da3a30> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae39cac700> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae39cac9d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39cac7c0> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae39cac8b0> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc 
matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae39cacd00> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae39cb7250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39cac940> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39ca0a90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39da3610> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39cacaf0> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fae39be06d0> # zipimport: found 103 names in '/tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39543820> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae395d2730> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae395d2610> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae395d2340> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae395d2460> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae395d2160> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fae395d23a0> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae395d2790> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae395c2820> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae395c2490> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae395c2640> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae394c8520> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fae395cdd60> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae395d24f0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae395cd1c0> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae395d1b20> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae395a1160> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae395a1760> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae394ced30> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae395a1670> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' 
import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39b54d00> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae39524a00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39b5ee80> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae395330a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39b5eeb0> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39b66250> import 
'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae395330d0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae39b66a60> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae395f7b80> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae39b5ecd0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39b54ee0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object 
at 0x7fae3952f0d0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae39526310> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae3952fcd0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae3952f670> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39530100> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters 
# loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae3956d910> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae395729a0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae390c6640> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae395a97f0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: 
zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae395ef460> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae395610d0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae395721f0> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fae39574bb0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39b6f070> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae395642e0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae39079400> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae390d89a0> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae390d8df0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae390d6490> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae38f51040> import 
'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae38e413a0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae38e415e0> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae395606d0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae390e6730> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae395605e0> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae39088c70> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fae38ea09a0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae38e414f0> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip 
/tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae38e419d0> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae38dc0f40> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from 
'/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae38db83a0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae38e06100> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae38d4c6a0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae38d4ca90> import 
ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # 
zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip 
/tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available 
import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip 
/tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_f09_7n_x/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: 
zlib available # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fae38d2e0d0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae38d30880> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fae38d30fd0> import 'gc' # {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, 
"crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_service_mgr": "systemd", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde 
--show-dot $@\n}"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "41", "second": "21", "epoch": "1726882881", "epoch_int": "1726882881", "date": "2024-09-20", "time": "21:41:21", "iso8601_micro": "2024-09-21T01:41:21.782158Z", "iso8601": "2024-09-21T01:41:21Z", "iso8601_basic": "20240920T214121782158", "iso8601_basic_short": "20240920T214121", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing 
zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing 
pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing 
systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # 
destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing 
ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing 
ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing 
ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # 
destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] 
removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # 
cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping 
encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. [WARNING]: Module invocation had junk after the JSON data 33932 1726882881.83918: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882881.2117631-34032-45417688756060/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': 
None}) 33932 1726882881.83921: _low_level_execute_command(): starting 33932 1726882881.83923: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882881.2117631-34032-45417688756060/ > /dev/null 2>&1 && sleep 0' 33932 1726882881.84577: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882881.84589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882881.84624: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882881.84633: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882881.84636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882881.84700: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882881.84792: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882881.85096: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882881.86905: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882881.86980: stderr chunk (state=3): >>><<< 33932 1726882881.86983: stdout chunk (state=3): >>><<< 33932 
1726882881.87073: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882881.87078: handler run complete 33932 1726882881.87080: variable 'ansible_facts' from source: unknown 33932 1726882881.87118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882881.87233: variable 'ansible_facts' from source: unknown 33932 1726882881.87280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882881.87423: attempt loop complete, returning result 33932 1726882881.87426: _execute() done 33932 1726882881.87429: dumping result to json 33932 1726882881.87431: done dumping result, returning 33932 1726882881.87433: done running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the 
network role test [0e448fcc-3ce9-615b-5c48-0000000000c0] 33932 1726882881.87435: sending task result for task 0e448fcc-3ce9-615b-5c48-0000000000c0 33932 1726882881.87532: done sending task result for task 0e448fcc-3ce9-615b-5c48-0000000000c0 33932 1726882881.87535: WORKER PROCESS EXITING ok: [managed_node1] 33932 1726882881.87666: no more pending results, returning what we have 33932 1726882881.87669: results queue empty 33932 1726882881.87670: checking for any_errors_fatal 33932 1726882881.87671: done checking for any_errors_fatal 33932 1726882881.87672: checking for max_fail_percentage 33932 1726882881.87673: done checking for max_fail_percentage 33932 1726882881.87674: checking to see if all hosts have failed and the running result is not ok 33932 1726882881.87675: done checking to see if all hosts have failed 33932 1726882881.87675: getting the remaining hosts for this loop 33932 1726882881.87677: done getting the remaining hosts for this loop 33932 1726882881.87680: getting the next task for host managed_node1 33932 1726882881.87687: done getting next task for host managed_node1 33932 1726882881.87689: ^ task is: TASK: Check if system is ostree 33932 1726882881.87691: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882881.87694: getting variables 33932 1726882881.87695: in VariableManager get_vars() 33932 1726882881.87747: Calling all_inventory to load vars for managed_node1 33932 1726882881.87750: Calling groups_inventory to load vars for managed_node1 33932 1726882881.87754: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882881.88072: Calling all_plugins_play to load vars for managed_node1 33932 1726882881.88077: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882881.88081: Calling groups_plugins_play to load vars for managed_node1 33932 1726882881.88262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882881.88672: done with get_vars() 33932 1726882881.88684: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 21:41:21 -0400 (0:00:00.812) 0:00:02.358 ****** 33932 1726882881.89090: entering _queue_task() for managed_node1/stat 33932 1726882881.89900: worker is 1 (out of 1 available) 33932 1726882881.89913: exiting _queue_task() for managed_node1/stat 33932 1726882881.89933: done queuing things up, now waiting for results queue to drain 33932 1726882881.89935: waiting for pending results... 
33932 1726882881.90739: running TaskExecutor() for managed_node1/TASK: Check if system is ostree 33932 1726882881.90940: in run() - task 0e448fcc-3ce9-615b-5c48-0000000000c2 33932 1726882881.91032: variable 'ansible_search_path' from source: unknown 33932 1726882881.91046: variable 'ansible_search_path' from source: unknown 33932 1726882881.91154: calling self._execute() 33932 1726882881.91347: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882881.91371: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882881.91386: variable 'omit' from source: magic vars 33932 1726882881.92403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 33932 1726882881.93086: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 33932 1726882881.93155: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 33932 1726882881.93300: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 33932 1726882881.93392: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 33932 1726882881.93514: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 33932 1726882881.93677: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 33932 1726882881.93709: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882881.94542: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 33932 1726882881.94900: Evaluated conditional (not __network_is_ostree is defined): True 33932 1726882881.94965: variable 'omit' from source: magic vars 33932 1726882881.95101: variable 'omit' from source: magic vars 33932 1726882881.95214: variable 'omit' from source: magic vars 33932 1726882881.95245: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882881.95526: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882881.95549: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882881.95576: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882881.95592: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882881.95737: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882881.95745: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882881.95753: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882881.96092: Set connection var ansible_shell_executable to /bin/sh 33932 1726882881.96106: Set connection var ansible_timeout to 10 33932 1726882881.96130: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882881.96173: Set connection var ansible_pipelining to False 33932 1726882881.96182: Set connection var ansible_connection to ssh 33932 1726882881.96266: Set connection var ansible_shell_type to sh 33932 1726882881.96307: variable 'ansible_shell_executable' from source: unknown 33932 1726882881.96315: variable 'ansible_connection' from 
source: unknown 33932 1726882881.96324: variable 'ansible_module_compression' from source: unknown 33932 1726882881.96331: variable 'ansible_shell_type' from source: unknown 33932 1726882881.96338: variable 'ansible_shell_executable' from source: unknown 33932 1726882881.96345: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882881.96353: variable 'ansible_pipelining' from source: unknown 33932 1726882881.96363: variable 'ansible_timeout' from source: unknown 33932 1726882881.96384: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882881.96751: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 33932 1726882881.96772: variable 'omit' from source: magic vars 33932 1726882881.96783: starting attempt loop 33932 1726882881.96790: running the handler 33932 1726882881.96814: _low_level_execute_command(): starting 33932 1726882881.96836: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 33932 1726882881.98892: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882881.99404: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882881.99408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882881.99452: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 33932 1726882881.99456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882881.99458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882881.99514: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882881.99651: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882881.99654: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882881.99765: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882882.01437: stdout chunk (state=3): >>>/root <<< 33932 1726882882.01549: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882882.01619: stderr chunk (state=3): >>><<< 33932 1726882882.01622: stdout chunk (state=3): >>><<< 33932 1726882882.01691: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882882.01702: _low_level_execute_command(): starting 33932 1726882882.01706: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882882.0164418-34075-242031351729622 `" && echo ansible-tmp-1726882882.0164418-34075-242031351729622="` echo /root/.ansible/tmp/ansible-tmp-1726882882.0164418-34075-242031351729622 `" ) && sleep 0' 33932 1726882882.02822: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882882.02826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882882.02987: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 33932 1726882882.03583: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 <<< 33932 1726882882.03694: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882882.05751: stdout chunk (state=3): >>>ansible-tmp-1726882882.0164418-34075-242031351729622=/root/.ansible/tmp/ansible-tmp-1726882882.0164418-34075-242031351729622 <<< 33932 1726882882.05801: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882882.05872: stderr chunk (state=3): >>><<< 33932 1726882882.05884: stdout chunk (state=3): >>><<< 33932 1726882882.06077: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882882.0164418-34075-242031351729622=/root/.ansible/tmp/ansible-tmp-1726882882.0164418-34075-242031351729622 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882882.06080: variable 'ansible_module_compression' from source: unknown 33932 1726882882.06083: ANSIBALLZ: Using lock for stat 
33932 1726882882.06085: ANSIBALLZ: Acquiring lock 33932 1726882882.06087: ANSIBALLZ: Lock acquired: 140301142160208 33932 1726882882.06088: ANSIBALLZ: Creating module 33932 1726882882.24349: ANSIBALLZ: Writing module into payload 33932 1726882882.24495: ANSIBALLZ: Writing module 33932 1726882882.24520: ANSIBALLZ: Renaming module 33932 1726882882.24530: ANSIBALLZ: Done creating module 33932 1726882882.24549: variable 'ansible_facts' from source: unknown 33932 1726882882.24646: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882882.0164418-34075-242031351729622/AnsiballZ_stat.py 33932 1726882882.24822: Sending initial data 33932 1726882882.24825: Sent initial data (153 bytes) 33932 1726882882.25860: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882882.25881: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882882.25897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882882.25927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882882.25966: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882882.25980: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882882.25991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882882.26005: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882882.26022: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882882.26033: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882882.26043: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882882.26054: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 33932 1726882882.26072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882882.26083: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882882.26092: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882882.26103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882882.26210: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882882.26228: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882882.26258: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882882.26417: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 33932 1726882882.29036: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 33932 1726882882.29139: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 33932 1726882882.29254: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-33932njcnmxb6/tmpdamz6luz /root/.ansible/tmp/ansible-tmp-1726882882.0164418-34075-242031351729622/AnsiballZ_stat.py <<< 33932 1726882882.29359: stderr chunk (state=3): >>>debug1: Couldn't stat remote 
file: No such file or directory <<< 33932 1726882882.30811: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882882.30896: stderr chunk (state=3): >>><<< 33932 1726882882.30900: stdout chunk (state=3): >>><<< 33932 1726882882.30919: done transferring module to remote 33932 1726882882.30935: _low_level_execute_command(): starting 33932 1726882882.30940: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882882.0164418-34075-242031351729622/ /root/.ansible/tmp/ansible-tmp-1726882882.0164418-34075-242031351729622/AnsiballZ_stat.py && sleep 0' 33932 1726882882.31626: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882882.31635: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882882.31645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882882.31658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882882.31708: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882882.31715: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882882.31725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882882.31749: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882882.31752: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882882.31756: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882882.31761: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882882.31780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882882.31792: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882882.31799: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882882.31806: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882882.31814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882882.31899: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882882.31918: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882882.31931: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882882.32059: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 33932 1726882882.34687: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882882.34691: stdout chunk (state=3): >>><<< 33932 1726882882.34697: stderr chunk (state=3): >>><<< 33932 1726882882.34715: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 33932 1726882882.34719: _low_level_execute_command(): starting 33932 1726882882.34721: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882882.0164418-34075-242031351729622/AnsiballZ_stat.py && sleep 0' 33932 1726882882.35480: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882882.35490: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882882.35504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882882.35516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882882.35552: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882882.35558: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882882.35569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882882.35585: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882882.35591: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882882.35597: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882882.35606: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882882.35616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882882.35631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 
1726882882.35636: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882882.35642: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882882.35651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882882.35732: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882882.35749: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882882.35753: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882882.35890: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 33932 1726882882.38699: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 33932 1726882882.38739: stdout chunk (state=3): >>>import '_thread' # <<< 33932 1726882882.38743: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 33932 1726882882.38752: stdout chunk (state=3): >>> <<< 33932 1726882882.38855: stdout chunk (state=3): >>>import '_io' # <<< 33932 1726882882.38858: stdout chunk (state=3): >>>import 'marshal' # <<< 33932 1726882882.38861: stdout chunk (state=3): >>> <<< 33932 1726882882.38927: stdout chunk (state=3): >>>import 'posix' # <<< 33932 1726882882.38982: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 33932 1726882882.38985: stdout chunk (state=3): >>># installing zipimport hook <<< 33932 1726882882.39036: stdout chunk (state=3): >>>import 'time' # <<< 33932 1726882882.39058: stdout chunk (state=3): >>>import 'zipimport' # <<< 33932 1726882882.39076: stdout chunk (state=3): >>> <<< 33932 1726882882.39083: stdout chunk (state=3): >>># installed zipimport hook <<< 33932 1726882882.39388: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object 
from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fadf3dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fad983a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fadf3b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fadf3ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fad98490> <<< 33932 1726882882.39395: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py <<< 33932 1726882882.39412: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 33932 1726882882.39428: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 33932 1726882882.39446: stdout chunk (state=3): >>>import '_abc' # <<< 33932 1726882882.39449: stdout chunk (state=3): >>>import 'abc' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fad98940> <<< 33932 1726882882.39470: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fad98670> <<< 33932 1726882882.39512: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 33932 1726882882.39516: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 33932 1726882882.39562: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 33932 1726882882.39565: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 33932 1726882882.39589: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 33932 1726882882.39608: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 33932 1726882882.39639: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fad4f190> <<< 33932 1726882882.39672: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 33932 1726882882.39688: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 33932 1726882882.39792: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fad4f220> <<< 33932 1726882882.39823: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py <<< 33932 1726882882.39830: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 33932 1726882882.39880: stdout chunk 
(state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fad72850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fad4f940> <<< 33932 1726882882.39948: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fadb0880> <<< 33932 1726882882.39954: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fad48d90> <<< 33932 1726882882.40020: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 33932 1726882882.40041: stdout chunk (state=3): >>>import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fad72d90> <<< 33932 1726882882.40116: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fad98970> <<< 33932 1726882882.40153: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 33932 1726882882.40484: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 33932 1726882882.40488: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 33932 1726882882.40541: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py <<< 33932 1726882882.40544: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 33932 1726882882.40558: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 33932 1726882882.40578: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 33932 1726882882.40602: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 33932 1726882882.40622: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6faceeeb0> <<< 33932 1726882882.40727: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6facf1f40> <<< 33932 1726882882.40744: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 33932 1726882882.40759: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # <<< 33932 1726882882.40786: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 33932 1726882882.40799: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 33932 1726882882.40849: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 33932 1726882882.40916: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6face7610> <<< 33932 1726882882.40925: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6faced640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6facee370> <<< 33932 1726882882.40928: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 33932 1726882882.41046: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 33932 1726882882.41053: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 33932 1726882882.41117: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 33932 1726882882.41127: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 33932 1726882882.41135: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' <<< 33932 1726882882.41185: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fac6fe20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fac6f910> import 'itertools' # <<< 33932 1726882882.41205: stdout 
chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fac6ff10> <<< 33932 1726882882.41209: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 33932 1726882882.41222: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 33932 1726882882.41276: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fac6ffd0> <<< 33932 1726882882.41295: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' <<< 33932 1726882882.41314: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fac820d0> <<< 33932 1726882882.41317: stdout chunk (state=3): >>>import '_collections' # <<< 33932 1726882882.41397: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6facc9d90> <<< 33932 1726882882.41404: stdout chunk (state=3): >>>import '_functools' # <<< 33932 1726882882.41515: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6facc2670> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6facd56d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6facf5e20> <<< 33932 1726882882.41585: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 33932 1726882882.41590: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fac82cd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6facc92b0> <<< 33932 1726882882.41830: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6facd52e0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6facfb9d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fac82eb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fac82df0> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from 
'/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' <<< 33932 1726882882.41834: stdout chunk (state=3): >>>import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fac82d60> <<< 33932 1726882882.41837: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 33932 1726882882.41865: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py <<< 33932 1726882882.41871: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 33932 1726882882.41900: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 33932 1726882882.41952: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 33932 1726882882.41999: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fac553d0> <<< 33932 1726882882.42096: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fac554c0> <<< 33932 1726882882.42267: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fac89f40> <<< 33932 1726882882.42453: stdout chunk (state=3): >>>import 'importlib.abc' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb6fac84a90> <<< 33932 1726882882.42482: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fac84490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' <<< 33932 1726882882.42486: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa96e220> <<< 33932 1726882882.42558: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fac40520> <<< 33932 1726882882.42704: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fac84f10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6facfb040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 33932 1726882882.42707: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' <<< 33932 1726882882.42710: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa980b50> <<< 33932 1726882882.42712: stdout chunk (state=3): >>>import 'errno' # <<< 33932 1726882882.42816: stdout chunk (state=3): >>># extension module 'zlib' loaded from 
'/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa980e80> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 33932 1726882882.42824: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py <<< 33932 1726882882.42827: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' <<< 33932 1726882882.42849: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa991790> <<< 33932 1726882882.42861: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 33932 1726882882.42902: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 33932 1726882882.43022: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa991cd0> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa91f400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa980f70> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py <<< 33932 1726882882.43036: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 33932 1726882882.43301: stdout chunk (state=3): >>># 
extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa9302e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa991610> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa9303a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fac82a30> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 33932 1726882882.43329: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' <<< 33932 1726882882.43335: stdout chunk (state=3): >>>import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa94b700> <<< 33932 1726882882.43407: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from 
'/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa94b9d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa94b7c0> <<< 33932 1726882882.43437: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' <<< 33932 1726882882.43440: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa94b8b0> <<< 33932 1726882882.43503: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 33932 1726882882.43795: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa94bd00> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa956250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa94b940> <<< 33932 1726882882.43808: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa93fa90> <<< 33932 1726882882.43835: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fb6fac82610> <<< 33932 1726882882.43940: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 33932 1726882882.43981: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa94baf0> <<< 33932 1726882882.44108: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 33932 1726882882.44128: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fb6fa86f6d0> <<< 33932 1726882882.44356: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip' # zipimport: zlib available <<< 33932 1726882882.44505: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882882.44534: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/__init__.py <<< 33932 1726882882.44549: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882882.44562: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882882.44585: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/__init__.py <<< 33932 1726882882.44597: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882882.46571: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882882.48172: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py <<< 33932 1726882882.48187: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fb6fa76c820> <<< 33932 1726882882.48273: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py <<< 33932 1726882882.48276: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 33932 1726882882.48280: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py <<< 33932 1726882882.48282: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 33932 1726882882.48575: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 33932 1726882882.48578: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' <<< 33932 1726882882.48582: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa7fc730> <<< 33932 1726882882.48585: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa7fc610> <<< 33932 1726882882.48587: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa7fc340> <<< 33932 1726882882.48589: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py <<< 33932 1726882882.48591: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 33932 1726882882.48599: stdout chunk (state=3): >>>import 'json.encoder' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa7fc460> <<< 33932 1726882882.48616: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa7fc160> <<< 33932 1726882882.48628: stdout chunk (state=3): >>>import 'atexit' # <<< 33932 1726882882.48678: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' <<< 33932 1726882882.48685: stdout chunk (state=3): >>>import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa7fc3a0> <<< 33932 1726882882.48711: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 33932 1726882882.48770: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 33932 1726882882.48822: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa7fc790> <<< 33932 1726882882.48853: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 33932 1726882882.48879: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 33932 1726882882.48907: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 33932 1726882882.48941: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 33932 1726882882.48977: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py <<< 33932 1726882882.48983: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 33932 1726882882.49117: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa1ad7f0> <<< 33932 1726882882.49170: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' <<< 33932 1726882882.49176: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa1adb80> <<< 33932 1726882882.49236: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' <<< 33932 1726882882.49243: stdout chunk (state=3): >>>import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa1ad9d0> <<< 33932 1726882882.49277: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 33932 1726882882.49336: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 33932 1726882882.49391: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa1ccaf0> <<< 33932 1726882882.49423: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa7f5d60> <<< 33932 1726882882.49695: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa7fc4f0> <<< 33932 1726882882.49731: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py <<< 
33932 1726882882.49737: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 33932 1726882882.49769: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa7f51c0> <<< 33932 1726882882.49799: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py <<< 33932 1726882882.49826: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 33932 1726882882.49867: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py <<< 33932 1726882882.49876: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 33932 1726882882.49917: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 33932 1726882882.49940: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 33932 1726882882.49980: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc'<<< 33932 1726882882.50007: stdout chunk (state=3): >>> <<< 33932 1726882882.50013: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa768b20> <<< 33932 1726882882.50154: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa79deb0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa79d8b0><<< 33932 1726882882.50174: stdout chunk (state=3): >>> <<< 33932 1726882882.50181: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fb6fa1c7d30> <<< 33932 1726882882.50221: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so'<<< 33932 1726882882.50252: stdout chunk (state=3): >>> <<< 33932 1726882882.50260: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa79d9a0> <<< 33932 1726882882.50306: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py<<< 33932 1726882882.50325: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa7ccd00> <<< 33932 1726882882.50370: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 33932 1726882882.50398: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc'<<< 33932 1726882882.50405: stdout chunk (state=3): >>> <<< 33932 1726882882.50429: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 33932 1726882882.50485: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 33932 1726882882.50608: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so'<<< 33932 1726882882.50622: stdout chunk (state=3): >>> import 
'_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa18ea00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa7d4e80> <<< 33932 1726882882.50649: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 33932 1726882882.50687: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 33932 1726882882.50771: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so'<<< 33932 1726882882.50799: stdout chunk (state=3): >>> # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa19d0a0><<< 33932 1726882882.50805: stdout chunk (state=3): >>> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa7d4eb0> <<< 33932 1726882882.50834: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py<<< 33932 1726882882.50840: stdout chunk (state=3): >>> <<< 33932 1726882882.50916: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 33932 1726882882.50947: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py <<< 33932 1726882882.50966: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' <<< 33932 1726882882.50980: stdout chunk (state=3): >>>import '_string' # <<< 33932 1726882882.51087: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa7a1730> <<< 33932 1726882882.51318: stdout chunk (state=3): >>>import 'logging' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa19d0d0> <<< 33932 1726882882.51470: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' <<< 33932 1726882882.51476: stdout chunk (state=3): >>>import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa19a550> <<< 33932 1726882882.51520: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so'<<< 33932 1726882882.51532: stdout chunk (state=3): >>> import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa19a610> <<< 33932 1726882882.51608: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' <<< 33932 1726882882.51622: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa199c40> <<< 33932 1726882882.51638: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa7ccee0> <<< 33932 1726882882.51672: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py<<< 33932 1726882882.51688: stdout chunk (state=3): >>> # code object from 
'/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 33932 1726882882.51717: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 33932 1726882882.51745: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 33932 1726882882.51831: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' <<< 33932 1726882882.51837: stdout chunk (state=3): >>>import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa75cb50> <<< 33932 1726882882.52171: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' <<< 33932 1726882882.52190: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa75b940> <<< 33932 1726882882.52197: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa190820> <<< 33932 1726882882.52244: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' <<< 33932 1726882882.52265: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa75c5b0><<< 33932 1726882882.52287: stdout chunk (state=3): >>> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa795af0> <<< 
33932 1726882882.52306: stdout chunk (state=3): >>># zipimport: zlib available<<< 33932 1726882882.52312: stdout chunk (state=3): >>> <<< 33932 1726882882.52337: stdout chunk (state=3): >>># zipimport: zlib available<<< 33932 1726882882.52345: stdout chunk (state=3): >>> <<< 33932 1726882882.52527: stdout chunk (state=3): >>>import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available <<< 33932 1726882882.52645: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882882.52674: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882882.52684: stdout chunk (state=3): >>>import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py <<< 33932 1726882882.52703: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882882.52727: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882882.52745: stdout chunk (state=3): >>>import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py <<< 33932 1726882882.52777: stdout chunk (state=3): >>># zipimport: zlib available<<< 33932 1726882882.52783: stdout chunk (state=3): >>> <<< 33932 1726882882.52949: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882882.53113: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882882.53901: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882882.54697: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py <<< 33932 1726882882.54721: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 33932 
1726882882.54747: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py<<< 33932 1726882882.54753: stdout chunk (state=3): >>> <<< 33932 1726882882.54794: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py<<< 33932 1726882882.54798: stdout chunk (state=3): >>> <<< 33932 1726882882.54823: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc'<<< 33932 1726882882.54829: stdout chunk (state=3): >>> <<< 33932 1726882882.54976: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6f9d85df0><<< 33932 1726882882.54982: stdout chunk (state=3): >>> <<< 33932 1726882882.55060: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 33932 1726882882.55087: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa1695b0> <<< 33932 1726882882.55105: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa15bdf0> <<< 33932 1726882882.55187: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py <<< 33932 1726882882.55213: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 33932 1726882882.55243: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882882.55280: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/_text.py <<< 33932 1726882882.55295: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882882.55496: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882882.55711: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py <<< 33932 1726882882.55720: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 33932 1726882882.55755: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa7529d0> <<< 33932 1726882882.55778: stdout chunk (state=3): >>># zipimport: zlib available<<< 33932 1726882882.55784: stdout chunk (state=3): >>> <<< 33932 1726882882.56461: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882882.57103: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882882.57199: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882882.57316: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/common/collections.py <<< 33932 1726882882.57320: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882882.57441: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available<<< 33932 1726882882.57444: stdout chunk (state=3): >>> <<< 33932 1726882882.57536: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 33932 1726882882.57648: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/errors.py <<< 33932 1726882882.57678: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882882.57705: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882882.57713: stdout chunk (state=3): >>>import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py <<< 33932 1726882882.57737: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882882.57807: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882882.57859: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 33932 1726882882.57886: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882882.58202: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882882.58530: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 33932 1726882882.58580: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 33932 1726882882.58600: stdout chunk (state=3): >>>import '_ast' # <<< 33932 1726882882.58606: stdout chunk (state=3): >>> <<< 33932 1726882882.58724: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6f9d56e50> <<< 33932 1726882882.58742: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882882.58849: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882882.58999: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip 
/tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py <<< 33932 1726882882.59002: stdout chunk (state=3): >>>import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py <<< 33932 1726882882.59058: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available <<< 33932 1726882882.59114: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882882.59190: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available <<< 33932 1726882882.59273: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882882.59322: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882882.59473: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882882.59598: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 33932 1726882882.59648: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 33932 1726882882.59815: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' <<< 33932 1726882882.59822: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from 
'/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa7e6910> <<< 33932 1726882882.59894: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6f9d56be0><<< 33932 1726882882.59900: stdout chunk (state=3): >>> <<< 33932 1726882882.59958: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/common/file.py <<< 33932 1726882882.59978: stdout chunk (state=3): >>>import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/common/process.py <<< 33932 1726882882.60001: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882882.60290: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882882.60385: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882882.60420: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882882.60481: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 33932 1726882882.60509: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 33932 1726882882.60541: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 33932 1726882882.60611: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 33932 1726882882.60639: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 33932 1726882882.60672: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 33932 1726882882.60860: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6f9d18c70> <<< 33932 1726882882.60915: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa15d670><<< 33932 1726882882.60921: stdout chunk (state=3): >>> <<< 33932 1726882882.61025: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa15c850> # destroy ansible.module_utils.distro<<< 33932 1726882882.61042: stdout chunk (state=3): >>> import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available <<< 33932 1726882882.61089: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882882.61133: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py<<< 33932 1726882882.61139: stdout chunk (state=3): >>> <<< 33932 1726882882.61335: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/modules/__init__.py # zipimport: zlib available <<< 33932 1726882882.61505: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882882.61786: stdout chunk (state=3): >>># zipimport: zlib available <<< 33932 1726882882.62043: stdout chunk (state=3): >>> {"changed": false, "stat": 
{"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 33932 1726882882.62077: stdout chunk (state=3): >>># destroy __main__ <<< 33932 1726882882.62545: stdout chunk (state=3): >>># clear builtins._ <<< 33932 1726882882.62636: stdout chunk (state=3): >>># clear sys.path # clear sys.argv <<< 33932 1726882882.62713: stdout chunk (state=3): >>># clear sys.ps1 # clear sys.ps2 <<< 33932 1726882882.62755: stdout chunk (state=3): >>># clear sys.last_type <<< 33932 1726882882.62818: stdout chunk (state=3): >>># clear sys.last_value # clear sys.last_traceback <<< 33932 1726882882.62917: stdout chunk (state=3): >>># clear sys.path_hooks <<< 33932 1726882882.62958: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 33932 1726882882.63041: stdout chunk (state=3): >>># clear sys.meta_path <<< 33932 1726882882.63092: stdout chunk (state=3): >>># clear sys.__interactivehook__ <<< 33932 1726882882.63147: stdout chunk (state=3): >>># restore sys.stdin<<< 33932 1726882882.63213: stdout chunk (state=3): >>> <<< 33932 1726882882.63277: stdout chunk (state=3): >>># restore sys.stdout<<< 33932 1726882882.63356: stdout chunk (state=3): >>> # restore sys.stderr<<< 33932 1726882882.63363: stdout chunk (state=3): >>> # cleanup[2] removing sys <<< 33932 1726882882.63398: stdout chunk (state=3): >>># cleanup[2] removing builtins <<< 33932 1726882882.63461: stdout chunk (state=3): >>># cleanup[2] removing _frozen_importlib <<< 33932 1726882882.63466: stdout chunk (state=3): >>># cleanup[2] removing _imp # cleanup[2] removing _thread<<< 33932 1726882882.63484: stdout chunk (state=3): >>> # cleanup[2] removing _warnings <<< 33932 1726882882.63489: stdout chunk (state=3): >>># cleanup[2] removing _weakref<<< 33932 1726882882.63495: stdout chunk (state=3): >>> # cleanup[2] removing _io # cleanup[2] removing marshal<<< 33932 1726882882.63503: 
stdout chunk (state=3): >>> # cleanup[2] removing posix<<< 33932 1726882882.63505: stdout chunk (state=3): >>> <<< 33932 1726882882.63507: stdout chunk (state=3): >>># cleanup[2] removing _frozen_importlib_external <<< 33932 1726882882.63512: stdout chunk (state=3): >>># cleanup[2] removing time # cleanup[2] removing zipimport<<< 33932 1726882882.63527: stdout chunk (state=3): >>> # cleanup[2] removing _codecs <<< 33932 1726882882.63539: stdout chunk (state=3): >>># cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 <<< 33932 1726882882.63550: stdout chunk (state=3): >>># cleanup[2] removing _signal <<< 33932 1726882882.63576: stdout chunk (state=3): >>># cleanup[2] removing encodings.latin_1<<< 33932 1726882882.63586: stdout chunk (state=3): >>> # cleanup[2] removing _abc<<< 33932 1726882882.63598: stdout chunk (state=3): >>> <<< 33932 1726882882.63603: stdout chunk (state=3): >>># cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools 
# cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc <<< 33932 1726882882.63609: stdout chunk (state=3): >>># cleanup[2] removing importlib.util <<< 33932 1726882882.63612: stdout chunk (state=3): >>># cleanup[2] removing _weakrefset <<< 33932 1726882882.63614: stdout chunk (state=3): >>># destroy _weakrefset <<< 33932 1726882882.63620: stdout chunk (state=3): >>># cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__<<< 33932 1726882882.63625: stdout chunk (state=3): >>> # destroy __future__ # cleanup[2] 
removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess<<< 33932 1726882882.63628: stdout chunk (state=3): >>> # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token<<< 33932 1726882882.63632: stdout chunk (state=3): >>> # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six<<< 33932 1726882882.63635: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] 
removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes<<< 33932 1726882882.63637: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors<<< 33932 1726882882.63648: stdout chunk (state=3): >>> # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters<<< 33932 1726882882.63675: stdout chunk (state=3): >>> # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec <<< 33932 1726882882.63691: stdout chunk (state=3): >>># destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux<<< 33932 1726882882.63699: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # 
destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info <<< 33932 1726882882.63975: stdout chunk (state=3): >>># destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins <<< 33932 1726882882.64015: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 33932 1726882882.64050: stdout chunk (state=3): >>># destroy zipimport <<< 33932 1726882882.64124: stdout chunk (state=3): >>># destroy _compression # destroy binascii # destroy importlib <<< 33932 1726882882.64132: stdout chunk (state=3): >>># destroy struct # destroy bz2 # destroy lzma <<< 33932 1726882882.64199: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy tempfile <<< 33932 1726882882.64246: stdout chunk (state=3): >>># destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder<<< 33932 1726882882.64260: stdout chunk (state=3): >>> # destroy json.scanner <<< 33932 1726882882.64290: stdout chunk (state=3): >>># destroy _json <<< 33932 1726882882.64298: stdout chunk (state=3): >>># destroy encodings <<< 33932 1726882882.64343: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 33932 1726882882.64382: stdout chunk (state=3): >>># destroy array<<< 33932 1726882882.64390: stdout chunk (state=3): >>> # destroy datetime <<< 33932 1726882882.64443: stdout chunk (state=3): >>># destroy selinux # destroy distro <<< 33932 1726882882.64471: stdout chunk (state=3): >>># 
destroy json # destroy shlex # destroy logging <<< 33932 1726882882.64474: stdout chunk (state=3): >>># destroy argparse <<< 33932 1726882882.64580: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux <<< 33932 1726882882.64625: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket<<< 33932 1726882882.64654: stdout chunk (state=3): >>> # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid<<< 33932 1726882882.64681: stdout chunk (state=3): >>> # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache <<< 33932 1726882882.64700: stdout chunk (state=3): >>># cleanup[3] wiping tokenize <<< 33932 1726882882.64723: stdout chunk (state=3): >>># cleanup[3] wiping platform <<< 33932 1726882882.64759: stdout chunk (state=3): >>># destroy subprocess # cleanup[3] wiping selectors<<< 33932 1726882882.64784: stdout chunk (state=3): >>> # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess <<< 33932 1726882882.64821: stdout chunk (state=3): >>># cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 <<< 33932 1726882882.64853: stdout chunk (state=3): >>># cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random<<< 33932 1726882882.64879: stdout chunk (state=3): >>> # cleanup[3] wiping _bisect # cleanup[3] wiping math <<< 33932 1726882882.64892: stdout chunk (state=3): >>># cleanup[3] wiping shutil # destroy fnmatch <<< 33932 1726882882.64916: stdout chunk (state=3): >>># cleanup[3] wiping grp <<< 33932 1726882882.64966: stdout chunk (state=3): >>># cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # 
cleanup[3] wiping zlib<<< 33932 1726882882.64979: stdout chunk (state=3): >>> # cleanup[3] wiping errno # cleanup[3] wiping weakref<<< 33932 1726882882.65012: stdout chunk (state=3): >>> # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings <<< 33932 1726882882.65037: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap <<< 33932 1726882882.65051: stdout chunk (state=3): >>># cleanup[3] wiping _struct # cleanup[3] wiping re <<< 33932 1726882882.65091: stdout chunk (state=3): >>># destroy enum # destroy sre_compile<<< 33932 1726882882.65107: stdout chunk (state=3): >>> # destroy copyreg # cleanup[3] wiping functools <<< 33932 1726882882.65146: stdout chunk (state=3): >>># cleanup[3] wiping _functools # destroy _functools<<< 33932 1726882882.65161: stdout chunk (state=3): >>> # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq <<< 33932 1726882882.65181: stdout chunk (state=3): >>># destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools<<< 33932 1726882882.65240: stdout chunk (state=3): >>> # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre <<< 33932 1726882882.65253: stdout chunk (state=3): >>># cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath<<< 33932 1726882882.65282: stdout chunk (state=3): >>> # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases<<< 33932 1726882882.65318: stdout chunk (state=3): >>> # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # 
cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal<<< 33932 1726882882.65340: stdout chunk (state=3): >>> # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings<<< 33932 1726882882.65380: stdout chunk (state=3): >>> # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins<<< 33932 1726882882.65387: stdout chunk (state=3): >>> # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal<<< 33932 1726882882.65634: stdout chunk (state=3): >>> # destroy platform # destroy _uuid <<< 33932 1726882882.65649: stdout chunk (state=3): >>># destroy _sre # destroy sre_parse <<< 33932 1726882882.65677: stdout chunk (state=3): >>># destroy tokenize <<< 33932 1726882882.65688: stdout chunk (state=3): >>># destroy _heapq <<< 33932 1726882882.65700: stdout chunk (state=3): >>># destroy posixpath <<< 33932 1726882882.65724: stdout chunk (state=3): >>># destroy stat <<< 33932 1726882882.65818: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy errno<<< 33932 1726882882.65956: stdout chunk (state=3): >>> # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select <<< 33932 1726882882.65991: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy 
_frozen_importlib_external <<< 33932 1726882882.65999: stdout chunk (state=3): >>># destroy _imp # destroy io # destroy marshal <<< 33932 1726882882.66072: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 33932 1726882882.66552: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 33932 1726882882.66556: stdout chunk (state=3): >>><<< 33932 1726882882.66565: stderr chunk (state=3): >>><<< 33932 1726882882.66662: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fadf3dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fad983a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fadf3b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb6fadf3ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fad98490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fad98940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fad98670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fad4f190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fad4f220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fad72850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fad4f940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fadb0880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fad48d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fad72d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fad98970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6faceeeb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6facf1f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6face7610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6faced640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6facee370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches 
/usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fac6fe20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fac6f910> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fac6ff10> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fac6ffd0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fac820d0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6facc9d90> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6facc2670> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb6facd56d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6facf5e20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fac82cd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6facc92b0> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6facd52e0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6facfb9d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fac82eb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fac82df0> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches 
/usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fac82d60> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fac553d0> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fac554c0> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fac89f40> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fac84a90> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fac84490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from 
'/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa96e220> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fac40520> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fac84f10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6facfb040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa980b50> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa980e80> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa991790> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' 
import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa991cd0> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa91f400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa980f70> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa9302e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa991610> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa9303a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fac82a30> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from 
'/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa94b700> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa94b9d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa94b7c0> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa94b8b0> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa94bd00> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa956250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa94b940> import 
'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa93fa90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fac82610> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa94baf0> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fb6fa86f6d0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa76c820> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded 
from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa7fc730> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa7fc610> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa7fc340> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa7fc460> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa7fc160> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa7fc3a0> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa7fc790> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa1ad7f0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa1adb80> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa1ad9d0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa1ccaf0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa7f5d60> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa7fc4f0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa7f51c0> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from 
'/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa768b20> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa79deb0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa79d8b0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa1c7d30> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa79d9a0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa7ccd00> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa18ea00> 
import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa7d4e80> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa19d0a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa7d4eb0> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa7a1730> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa19d0d0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa19a550> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa19a610> # extension module 
'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa199c40> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa7ccee0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa75cb50> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa75b940> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa190820> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa75c5b0> import 'systemd.daemon' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa795af0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6f9d85df0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from 
'/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa1695b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa15bdf0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa7529d0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip 
/tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6f9d56e50> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' 
import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb6fa7e6910> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6f9d56be0> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6f9d18c70> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa15d670> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb6fa15c850> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip 
/tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_kydamcz4/ansible_stat_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # 
cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] 
removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy 
ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # 
destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # 
cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] 
wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. [WARNING]: Module invocation had junk after the JSON data: [interpreter-shutdown trace identical to the stdout dump above, elided] 33932 1726882882.67211: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882882.0164418-34075-242031351729622/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 33932 1726882882.67214: _low_level_execute_command(): starting 33932 1726882882.67216: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882882.0164418-34075-242031351729622/ > /dev/null 2>&1 && sleep 0' 33932 1726882882.67410: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882882.67420: stderr chunk
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882882.67428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882882.67441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882882.67487: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882882.67493: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882882.67503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882882.67515: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882882.67523: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882882.67530: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882882.67536: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882882.67545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882882.67555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882882.67574: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882882.67580: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882882.67590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882882.67660: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882882.67684: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882882.67698: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882882.67819: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 4 <<< 33932 1726882882.70490: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882882.70494: stdout chunk (state=3): >>><<< 33932 1726882882.70500: stderr chunk (state=3): >>><<< 33932 1726882882.70574: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 33932 1726882882.70578: handler run complete 33932 1726882882.70624: attempt loop complete, returning result 33932 1726882882.70627: _execute() done 33932 1726882882.70630: dumping result to json 33932 1726882882.70632: done dumping result, returning 33932 1726882882.70641: done running TaskExecutor() for managed_node1/TASK: Check if system is ostree [0e448fcc-3ce9-615b-5c48-0000000000c2] 33932 1726882882.70645: sending task result for task 0e448fcc-3ce9-615b-5c48-0000000000c2 33932 1726882882.70760: done 
sending task result for task 0e448fcc-3ce9-615b-5c48-0000000000c2 33932 1726882882.70762: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 33932 1726882882.70916: no more pending results, returning what we have 33932 1726882882.70920: results queue empty 33932 1726882882.70921: checking for any_errors_fatal 33932 1726882882.70930: done checking for any_errors_fatal 33932 1726882882.70931: checking for max_fail_percentage 33932 1726882882.70933: done checking for max_fail_percentage 33932 1726882882.70933: checking to see if all hosts have failed and the running result is not ok 33932 1726882882.70934: done checking to see if all hosts have failed 33932 1726882882.70935: getting the remaining hosts for this loop 33932 1726882882.70937: done getting the remaining hosts for this loop 33932 1726882882.70941: getting the next task for host managed_node1 33932 1726882882.70947: done getting next task for host managed_node1 33932 1726882882.70950: ^ task is: TASK: Set flag to indicate system is ostree 33932 1726882882.70953: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882882.70956: getting variables 33932 1726882882.70958: in VariableManager get_vars() 33932 1726882882.70991: Calling all_inventory to load vars for managed_node1 33932 1726882882.70995: Calling groups_inventory to load vars for managed_node1 33932 1726882882.70998: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882882.71010: Calling all_plugins_play to load vars for managed_node1 33932 1726882882.71013: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882882.71016: Calling groups_plugins_play to load vars for managed_node1 33932 1726882882.71325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882882.71646: done with get_vars() 33932 1726882882.71658: done getting variables 33932 1726882882.71767: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 21:41:22 -0400 (0:00:00.827) 0:00:03.185 ****** 33932 1726882882.71803: entering _queue_task() for managed_node1/set_fact 33932 1726882882.71805: Creating lock for set_fact 33932 1726882882.72109: worker is 1 (out of 1 available) 33932 1726882882.72122: exiting _queue_task() for managed_node1/set_fact 33932 1726882882.72133: done queuing things up, now waiting for results queue to drain 33932 1726882882.72135: waiting for pending results... 
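The entries above show the "Check if system is ostree" task completing: the `stat` module checked `/run/ostree-booted` over a multiplexed SSH connection (note the `auto-mux: Trying existing master` lines) and returned `exists: false`, after which the remote tmpdir was removed. A minimal sketch of what such a task might look like — reconstructed from the log, not the actual contents of `el_repo_setup.yml`:

```yaml
# Hypothetical reconstruction based on the log output above; the real
# task lives in tests/network/tasks/el_repo_setup.yml. The register
# name matches the '__ostree_booted_stat' variable seen later in the log.
- name: Check if system is ostree
  stat:
    path: /run/ostree-booted
  register: __ostree_booted_stat
```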
33932 1726882882.72420: running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree 33932 1726882882.72660: in run() - task 0e448fcc-3ce9-615b-5c48-0000000000c3 33932 1726882882.72681: variable 'ansible_search_path' from source: unknown 33932 1726882882.72687: variable 'ansible_search_path' from source: unknown 33932 1726882882.72733: calling self._execute() 33932 1726882882.72807: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882882.72825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882882.72838: variable 'omit' from source: magic vars 33932 1726882882.73381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 33932 1726882882.73613: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 33932 1726882882.73660: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 33932 1726882882.73706: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 33932 1726882882.73742: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 33932 1726882882.73835: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 33932 1726882882.73865: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 33932 1726882882.73897: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882882.73934: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 33932 1726882882.74068: Evaluated conditional (not __network_is_ostree is defined): True 33932 1726882882.74081: variable 'omit' from source: magic vars 33932 1726882882.74127: variable 'omit' from source: magic vars 33932 1726882882.74313: variable '__ostree_booted_stat' from source: set_fact 33932 1726882882.74397: variable 'omit' from source: magic vars 33932 1726882882.74429: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882882.74469: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882882.74492: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882882.74521: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882882.74537: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882882.74624: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882882.74633: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882882.74641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882882.74798: Set connection var ansible_shell_executable to /bin/sh 33932 1726882882.74811: Set connection var ansible_timeout to 10 33932 1726882882.74821: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882882.74830: Set connection var ansible_pipelining to False 33932 1726882882.74836: Set connection var ansible_connection to ssh 33932 1726882882.74842: Set connection var ansible_shell_type to sh 33932 1726882882.74873: variable 'ansible_shell_executable' 
from source: unknown 33932 1726882882.74887: variable 'ansible_connection' from source: unknown 33932 1726882882.74896: variable 'ansible_module_compression' from source: unknown 33932 1726882882.74911: variable 'ansible_shell_type' from source: unknown 33932 1726882882.74920: variable 'ansible_shell_executable' from source: unknown 33932 1726882882.74927: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882882.74935: variable 'ansible_pipelining' from source: unknown 33932 1726882882.74940: variable 'ansible_timeout' from source: unknown 33932 1726882882.74947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882882.75060: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882882.75079: variable 'omit' from source: magic vars 33932 1726882882.75090: starting attempt loop 33932 1726882882.75102: running the handler 33932 1726882882.75122: handler run complete 33932 1726882882.75136: attempt loop complete, returning result 33932 1726882882.75142: _execute() done 33932 1726882882.75148: dumping result to json 33932 1726882882.75155: done dumping result, returning 33932 1726882882.75168: done running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree [0e448fcc-3ce9-615b-5c48-0000000000c3] 33932 1726882882.75177: sending task result for task 0e448fcc-3ce9-615b-5c48-0000000000c3 ok: [managed_node1] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 33932 1726882882.75342: no more pending results, returning what we have 33932 1726882882.75346: results queue empty 33932 1726882882.75347: checking for any_errors_fatal 33932 1726882882.75355: done checking for any_errors_fatal 33932 
1726882882.75356: checking for max_fail_percentage 33932 1726882882.75358: done checking for max_fail_percentage 33932 1726882882.75359: checking to see if all hosts have failed and the running result is not ok 33932 1726882882.75360: done checking to see if all hosts have failed 33932 1726882882.75361: getting the remaining hosts for this loop 33932 1726882882.75365: done getting the remaining hosts for this loop 33932 1726882882.75369: getting the next task for host managed_node1 33932 1726882882.75379: done getting next task for host managed_node1 33932 1726882882.75383: ^ task is: TASK: Fix CentOS6 Base repo 33932 1726882882.75386: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882882.75391: getting variables 33932 1726882882.75393: in VariableManager get_vars() 33932 1726882882.75425: Calling all_inventory to load vars for managed_node1 33932 1726882882.75428: Calling groups_inventory to load vars for managed_node1 33932 1726882882.75432: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882882.75445: Calling all_plugins_play to load vars for managed_node1 33932 1726882882.75449: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882882.75452: Calling groups_plugins_play to load vars for managed_node1 33932 1726882882.75679: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882882.75899: done with get_vars() 33932 1726882882.75911: done getting variables 33932 1726882882.76254: done sending task result for task 0e448fcc-3ce9-615b-5c48-0000000000c3 33932 1726882882.76257: WORKER PROCESS EXITING 33932 1726882882.76312: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 21:41:22 -0400 (0:00:00.045) 0:00:03.231 ****** 33932 1726882882.76339: entering _queue_task() for managed_node1/copy 33932 1726882882.76568: worker is 1 (out of 1 available) 33932 1726882882.76579: exiting _queue_task() for managed_node1/copy 33932 1726882882.76591: done queuing things up, now waiting for results queue to drain 33932 1726882882.76593: waiting for pending results... 
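The "Set flag to indicate system is ostree" task ran entirely on the controller (no SSH traffic appears between "running the handler" and "handler run complete"), evaluated the conditional `not __network_is_ostree is defined` as True, and derived `__network_is_ostree: false` from the earlier stat result. A hedged sketch of such a task, assuming the fact is taken directly from the registered stat:

```yaml
# Hypothetical sketch matching the log; the actual task is at
# el_repo_setup.yml:22. The expression is an assumption - the log only
# shows that '__ostree_booted_stat' was read and the fact came out false.
- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
```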
33932 1726882882.77153: running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo 33932 1726882882.77430: in run() - task 0e448fcc-3ce9-615b-5c48-0000000000c5 33932 1726882882.77448: variable 'ansible_search_path' from source: unknown 33932 1726882882.77455: variable 'ansible_search_path' from source: unknown 33932 1726882882.77597: calling self._execute() 33932 1726882882.77744: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882882.77762: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882882.77777: variable 'omit' from source: magic vars 33932 1726882882.78819: variable 'ansible_distribution' from source: facts 33932 1726882882.78946: Evaluated conditional (ansible_distribution == 'CentOS'): True 33932 1726882882.79234: variable 'ansible_distribution_major_version' from source: facts 33932 1726882882.79246: Evaluated conditional (ansible_distribution_major_version == '6'): False 33932 1726882882.79253: when evaluation is False, skipping this task 33932 1726882882.79259: _execute() done 33932 1726882882.79268: dumping result to json 33932 1726882882.79275: done dumping result, returning 33932 1726882882.79286: done running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo [0e448fcc-3ce9-615b-5c48-0000000000c5] 33932 1726882882.79297: sending task result for task 0e448fcc-3ce9-615b-5c48-0000000000c5 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 33932 1726882882.79537: no more pending results, returning what we have 33932 1726882882.79541: results queue empty 33932 1726882882.79542: checking for any_errors_fatal 33932 1726882882.79547: done checking for any_errors_fatal 33932 1726882882.79547: checking for max_fail_percentage 33932 1726882882.79549: done checking for max_fail_percentage 33932 1726882882.79550: checking to see if all hosts have failed and the 
running result is not ok 33932 1726882882.79551: done checking to see if all hosts have failed 33932 1726882882.79552: getting the remaining hosts for this loop 33932 1726882882.79554: done getting the remaining hosts for this loop 33932 1726882882.79557: getting the next task for host managed_node1 33932 1726882882.79566: done getting next task for host managed_node1 33932 1726882882.79569: ^ task is: TASK: Include the task 'enable_epel.yml' 33932 1726882882.79584: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882882.79589: getting variables 33932 1726882882.79592: in VariableManager get_vars() 33932 1726882882.79628: Calling all_inventory to load vars for managed_node1 33932 1726882882.79631: Calling groups_inventory to load vars for managed_node1 33932 1726882882.79635: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882882.79670: Calling all_plugins_play to load vars for managed_node1 33932 1726882882.79680: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882882.79684: Calling groups_plugins_play to load vars for managed_node1 33932 1726882882.79946: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882882.80515: done with get_vars() 33932 1726882882.80524: done getting variables 33932 1726882882.80571: done sending task result for task 0e448fcc-3ce9-615b-5c48-0000000000c5 33932 1726882882.80574: WORKER PROCESS EXITING TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 21:41:22 -0400 (0:00:00.043) 0:00:03.274 ****** 33932 1726882882.80651: entering _queue_task() for managed_node1/include_tasks 33932 1726882882.80945: worker is 1 (out of 1 available) 33932 1726882882.80956: exiting _queue_task() for managed_node1/include_tasks 33932 1726882882.80969: done queuing things up, now waiting for results queue to drain 33932 1726882882.80971: waiting for pending results... 
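The "Fix CentOS6 Base repo" task above illustrates short-circuit conditional evaluation: `ansible_distribution == 'CentOS'` evaluated True, but `ansible_distribution_major_version == '6'` evaluated False, so the `copy` action was skipped with `false_condition` reported in the result. A sketch of that guard shape — the module arguments are illustrative only, since the log does not reveal them:

```yaml
# Hedged sketch of the skipped task (el_repo_setup.yml:26). Only the
# 'copy' action and the two when-conditions are visible in the log;
# dest and content below are placeholders, not from the source.
- name: Fix CentOS6 Base repo
  copy:
    dest: /etc/yum.repos.d/CentOS-Base.repo   # illustrative path
    content: "[base]\n# repo definition not shown in this log"
  when:
    - ansible_distribution == 'CentOS'
    - ansible_distribution_major_version == '6'
```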
33932 1726882882.81283: running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' 33932 1726882882.81376: in run() - task 0e448fcc-3ce9-615b-5c48-0000000000c6 33932 1726882882.81392: variable 'ansible_search_path' from source: unknown 33932 1726882882.81398: variable 'ansible_search_path' from source: unknown 33932 1726882882.81438: calling self._execute() 33932 1726882882.81507: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882882.81517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882882.81533: variable 'omit' from source: magic vars 33932 1726882882.81984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 33932 1726882882.85885: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 33932 1726882882.85960: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 33932 1726882882.86004: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 33932 1726882882.86043: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 33932 1726882882.86080: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 33932 1726882882.86159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882882.86198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882882.86229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882882.86301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882882.86321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882882.86443: variable '__network_is_ostree' from source: set_fact 33932 1726882882.86468: Evaluated conditional (not __network_is_ostree | d(false)): True 33932 1726882882.86480: _execute() done 33932 1726882882.86487: dumping result to json 33932 1726882882.86500: done dumping result, returning 33932 1726882882.86511: done running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' [0e448fcc-3ce9-615b-5c48-0000000000c6] 33932 1726882882.86521: sending task result for task 0e448fcc-3ce9-615b-5c48-0000000000c6 33932 1726882882.86649: no more pending results, returning what we have 33932 1726882882.86654: in VariableManager get_vars() 33932 1726882882.86768: Calling all_inventory to load vars for managed_node1 33932 1726882882.86771: Calling groups_inventory to load vars for managed_node1 33932 1726882882.86775: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882882.86786: Calling all_plugins_play to load vars for managed_node1 33932 1726882882.86790: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882882.86820: Calling groups_plugins_play to load vars for managed_node1 33932 1726882882.87148: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882882.87399: done with get_vars() 33932 1726882882.87407: variable 'ansible_search_path' from source: unknown 33932 
1726882882.87409: variable 'ansible_search_path' from source: unknown 33932 1726882882.87446: we have included files to process 33932 1726882882.87448: generating all_blocks data 33932 1726882882.87449: done generating all_blocks data 33932 1726882882.87457: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 33932 1726882882.87458: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 33932 1726882882.87461: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 33932 1726882882.88106: done sending task result for task 0e448fcc-3ce9-615b-5c48-0000000000c6 33932 1726882882.88110: WORKER PROCESS EXITING 33932 1726882882.88780: done processing included file 33932 1726882882.88782: iterating over new_blocks loaded from include file 33932 1726882882.88783: in VariableManager get_vars() 33932 1726882882.88795: done with get_vars() 33932 1726882882.88796: filtering new block on tags 33932 1726882882.88880: done filtering new block on tags 33932 1726882882.88883: in VariableManager get_vars() 33932 1726882882.88893: done with get_vars() 33932 1726882882.88895: filtering new block on tags 33932 1726882882.88905: done filtering new block on tags 33932 1726882882.88906: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node1 33932 1726882882.88910: extending task lists for all hosts with included blocks 33932 1726882882.89183: done extending task lists 33932 1726882882.89184: done processing included files 33932 1726882882.89185: results queue empty 33932 1726882882.89186: checking for any_errors_fatal 33932 1726882882.89188: done checking for any_errors_fatal 33932 1726882882.89210: checking for max_fail_percentage 33932 
1726882882.89211: done checking for max_fail_percentage 33932 1726882882.89212: checking to see if all hosts have failed and the running result is not ok 33932 1726882882.89213: done checking to see if all hosts have failed 33932 1726882882.89214: getting the remaining hosts for this loop 33932 1726882882.89215: done getting the remaining hosts for this loop 33932 1726882882.89218: getting the next task for host managed_node1 33932 1726882882.89222: done getting next task for host managed_node1 33932 1726882882.89224: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 33932 1726882882.89248: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882882.89252: getting variables 33932 1726882882.89253: in VariableManager get_vars() 33932 1726882882.89261: Calling all_inventory to load vars for managed_node1 33932 1726882882.89265: Calling groups_inventory to load vars for managed_node1 33932 1726882882.89268: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882882.89272: Calling all_plugins_play to load vars for managed_node1 33932 1726882882.89279: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882882.89282: Calling groups_plugins_play to load vars for managed_node1 33932 1726882882.89584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882882.90030: done with get_vars() 33932 1726882882.90070: done getting variables 33932 1726882882.90249: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 33932 1726882882.90545: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 9] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 21:41:22 -0400 (0:00:00.099) 0:00:03.373 ****** 33932 1726882882.90590: entering _queue_task() for managed_node1/command 33932 1726882882.90592: Creating lock for command 33932 1726882882.90870: worker is 1 (out of 1 available) 33932 1726882882.90882: exiting _queue_task() for managed_node1/command 33932 1726882882.90893: done queuing things up, now waiting for results queue to drain 33932 1726882882.90895: waiting for pending results... 
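The include step above shows the dynamic-include lifecycle: the conditional `not __network_is_ostree | d(false)` evaluated True, `enable_epel.yml` was loaded and parsed, its blocks were filtered on tags, and the host's task list was extended — which is why the nested `HOST STATE` dump afterwards gains another level of "tasks child state". A sketch of the include, assuming the guard shown in the log:

```yaml
# Hypothetical sketch of the include evaluated in the log
# (el_repo_setup.yml:51), guarded by the ostree fact set earlier.
- name: Include the task 'enable_epel.yml'
  include_tasks: tasks/enable_epel.yml
  when: not __network_is_ostree | d(false)
```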
33932 1726882882.91141: running TaskExecutor() for managed_node1/TASK: Create EPEL 9
33932 1726882882.91240: in run() - task 0e448fcc-3ce9-615b-5c48-0000000000e0
33932 1726882882.91257: variable 'ansible_search_path' from source: unknown
33932 1726882882.91266: variable 'ansible_search_path' from source: unknown
33932 1726882882.91304: calling self._execute()
33932 1726882882.91378: variable 'ansible_host' from source: host vars for 'managed_node1'
33932 1726882882.91388: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33932 1726882882.91400: variable 'omit' from source: magic vars
33932 1726882882.91752: variable 'ansible_distribution' from source: facts
33932 1726882882.91776: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
33932 1726882882.91905: variable 'ansible_distribution_major_version' from source: facts
33932 1726882882.91915: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
33932 1726882882.91921: when evaluation is False, skipping this task
33932 1726882882.91927: _execute() done
33932 1726882882.91933: dumping result to json
33932 1726882882.91940: done dumping result, returning
33932 1726882882.91949: done running TaskExecutor() for managed_node1/TASK: Create EPEL 9 [0e448fcc-3ce9-615b-5c48-0000000000e0]
33932 1726882882.91958: sending task result for task 0e448fcc-3ce9-615b-5c48-0000000000e0
33932 1726882882.92069: done sending task result for task 0e448fcc-3ce9-615b-5c48-0000000000e0
33932 1726882882.92076: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
33932 1726882882.92144: no more pending results, returning what we have
33932 1726882882.92148: results queue empty
33932 1726882882.92148: checking for any_errors_fatal
33932 1726882882.92150: done checking for any_errors_fatal
33932 1726882882.92151: checking for max_fail_percentage
33932 1726882882.92152: done checking for max_fail_percentage
33932 1726882882.92153: checking to see if all hosts have failed and the running result is not ok
33932 1726882882.92155: done checking to see if all hosts have failed
33932 1726882882.92156: getting the remaining hosts for this loop
33932 1726882882.92158: done getting the remaining hosts for this loop
33932 1726882882.92161: getting the next task for host managed_node1
33932 1726882882.92168: done getting next task for host managed_node1
33932 1726882882.92171: ^ task is: TASK: Install yum-utils package
33932 1726882882.92175: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33932 1726882882.92178: getting variables
33932 1726882882.92180: in VariableManager get_vars()
33932 1726882882.92208: Calling all_inventory to load vars for managed_node1
33932 1726882882.92211: Calling groups_inventory to load vars for managed_node1
33932 1726882882.92215: Calling all_plugins_inventory to load vars for managed_node1
33932 1726882882.92227: Calling all_plugins_play to load vars for managed_node1
33932 1726882882.92231: Calling groups_plugins_inventory to load vars for managed_node1
33932 1726882882.92234: Calling groups_plugins_play to load vars for managed_node1
33932 1726882882.92395: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33932 1726882882.92561: done with get_vars()
33932 1726882882.92573: done getting variables
33932 1726882882.92693: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Install yum-utils package] ***********************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26
Friday 20 September 2024 21:41:22 -0400 (0:00:00.021) 0:00:03.394 ******
33932 1726882882.92722: entering _queue_task() for managed_node1/package
33932 1726882882.92725: Creating lock for package
33932 1726882882.93146: worker is 1 (out of 1 available)
33932 1726882882.93158: exiting _queue_task() for managed_node1/package
33932 1726882882.93170: done queuing things up, now waiting for results queue to drain
33932 1726882882.93172: waiting for pending results...
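Every EPEL task in this trace is skipped the same way: `ansible_distribution in ['RedHat', 'CentOS']` evaluates to True, then `ansible_distribution_major_version in ['7', '8']` evaluates to False, and that first false conditional is reported back as `false_condition` in the skip result. A minimal sketch of the `when` pattern behind these skips, with hypothetical module arguments since the log records only the skip, not the actual task body in enable_epel.yml:

```yaml
# Sketch only: the real task body is not shown in this log.
- name: Install yum-utils package
  package:
    name: yum-utils        # hypothetical arguments
    state: present
  when:
    # List-form conditionals are ANDed and evaluated in order; the first
    # one that is False becomes "false_condition" in the skip result.
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```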
33932 1726882882.93403: running TaskExecutor() for managed_node1/TASK: Install yum-utils package
33932 1726882882.93506: in run() - task 0e448fcc-3ce9-615b-5c48-0000000000e1
33932 1726882882.93529: variable 'ansible_search_path' from source: unknown
33932 1726882882.93536: variable 'ansible_search_path' from source: unknown
33932 1726882882.93577: calling self._execute()
33932 1726882882.93721: variable 'ansible_host' from source: host vars for 'managed_node1'
33932 1726882882.93738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33932 1726882882.93753: variable 'omit' from source: magic vars
33932 1726882882.94094: variable 'ansible_distribution' from source: facts
33932 1726882882.94111: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
33932 1726882882.94244: variable 'ansible_distribution_major_version' from source: facts
33932 1726882882.94257: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
33932 1726882882.94269: when evaluation is False, skipping this task
33932 1726882882.94283: _execute() done
33932 1726882882.94293: dumping result to json
33932 1726882882.94302: done dumping result, returning
33932 1726882882.94314: done running TaskExecutor() for managed_node1/TASK: Install yum-utils package [0e448fcc-3ce9-615b-5c48-0000000000e1]
33932 1726882882.94325: sending task result for task 0e448fcc-3ce9-615b-5c48-0000000000e1
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
33932 1726882882.94460: no more pending results, returning what we have
33932 1726882882.94466: results queue empty
33932 1726882882.94467: checking for any_errors_fatal
33932 1726882882.94473: done checking for any_errors_fatal
33932 1726882882.94474: checking for max_fail_percentage
33932 1726882882.94475: done checking for max_fail_percentage
33932 1726882882.94477: checking to see if all hosts have failed and the running result is not ok
33932 1726882882.94477: done checking to see if all hosts have failed
33932 1726882882.94478: getting the remaining hosts for this loop
33932 1726882882.94480: done getting the remaining hosts for this loop
33932 1726882882.94484: getting the next task for host managed_node1
33932 1726882882.94490: done getting next task for host managed_node1
33932 1726882882.94493: ^ task is: TASK: Enable EPEL 7
33932 1726882882.94497: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33932 1726882882.94500: getting variables
33932 1726882882.94502: in VariableManager get_vars()
33932 1726882882.94577: Calling all_inventory to load vars for managed_node1
33932 1726882882.94580: Calling groups_inventory to load vars for managed_node1
33932 1726882882.94585: Calling all_plugins_inventory to load vars for managed_node1
33932 1726882882.94599: Calling all_plugins_play to load vars for managed_node1
33932 1726882882.94602: Calling groups_plugins_inventory to load vars for managed_node1
33932 1726882882.94606: Calling groups_plugins_play to load vars for managed_node1
33932 1726882882.94781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33932 1726882882.94985: done with get_vars()
33932 1726882882.94994: done getting variables
33932 1726882882.95274: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
33932 1726882882.95295: done sending task result for task 0e448fcc-3ce9-615b-5c48-0000000000e1
33932 1726882882.95298: WORKER PROCESS EXITING

TASK [Enable EPEL 7] ***********************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32
Friday 20 September 2024 21:41:22 -0400 (0:00:00.025) 0:00:03.420 ******
33932 1726882882.95309: entering _queue_task() for managed_node1/command
33932 1726882882.95514: worker is 1 (out of 1 available)
33932 1726882882.95526: exiting _queue_task() for managed_node1/command
33932 1726882882.95536: done queuing things up, now waiting for results queue to drain
33932 1726882882.95538: waiting for pending results...
33932 1726882882.95770: running TaskExecutor() for managed_node1/TASK: Enable EPEL 7
33932 1726882882.95866: in run() - task 0e448fcc-3ce9-615b-5c48-0000000000e2
33932 1726882882.95889: variable 'ansible_search_path' from source: unknown
33932 1726882882.95897: variable 'ansible_search_path' from source: unknown
33932 1726882882.95934: calling self._execute()
33932 1726882882.96007: variable 'ansible_host' from source: host vars for 'managed_node1'
33932 1726882882.96017: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33932 1726882882.96028: variable 'omit' from source: magic vars
33932 1726882882.96376: variable 'ansible_distribution' from source: facts
33932 1726882882.96392: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
33932 1726882882.96523: variable 'ansible_distribution_major_version' from source: facts
33932 1726882882.96538: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
33932 1726882882.96545: when evaluation is False, skipping this task
33932 1726882882.96551: _execute() done
33932 1726882882.96557: dumping result to json
33932 1726882882.96565: done dumping result, returning
33932 1726882882.96576: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 [0e448fcc-3ce9-615b-5c48-0000000000e2]
33932 1726882882.96586: sending task result for task 0e448fcc-3ce9-615b-5c48-0000000000e2
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
33932 1726882882.96718: no more pending results, returning what we have
33932 1726882882.96721: results queue empty
33932 1726882882.96722: checking for any_errors_fatal
33932 1726882882.96727: done checking for any_errors_fatal
33932 1726882882.96728: checking for max_fail_percentage
33932 1726882882.96730: done checking for max_fail_percentage
33932 1726882882.96731: checking to see if all hosts have failed and the running result is not ok
33932 1726882882.96732: done checking to see if all hosts have failed
33932 1726882882.96733: getting the remaining hosts for this loop
33932 1726882882.96734: done getting the remaining hosts for this loop
33932 1726882882.96738: getting the next task for host managed_node1
33932 1726882882.96744: done getting next task for host managed_node1
33932 1726882882.96746: ^ task is: TASK: Enable EPEL 8
33932 1726882882.96751: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33932 1726882882.96753: getting variables
33932 1726882882.96755: in VariableManager get_vars()
33932 1726882882.96783: Calling all_inventory to load vars for managed_node1
33932 1726882882.96786: Calling groups_inventory to load vars for managed_node1
33932 1726882882.96789: Calling all_plugins_inventory to load vars for managed_node1
33932 1726882882.96802: Calling all_plugins_play to load vars for managed_node1
33932 1726882882.96806: Calling groups_plugins_inventory to load vars for managed_node1
33932 1726882882.96808: Calling groups_plugins_play to load vars for managed_node1
33932 1726882882.96984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33932 1726882882.97181: done with get_vars()
33932 1726882882.97190: done getting variables
33932 1726882882.97257: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 8] ***********************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37
Friday 20 September 2024 21:41:22 -0400 (0:00:00.019) 0:00:03.440 ******
33932 1726882882.97292: entering _queue_task() for managed_node1/command
33932 1726882882.97310: done sending task result for task 0e448fcc-3ce9-615b-5c48-0000000000e2
33932 1726882882.97319: WORKER PROCESS EXITING
33932 1726882882.97660: worker is 1 (out of 1 available)
33932 1726882882.97672: exiting _queue_task() for managed_node1/command
33932 1726882882.97682: done queuing things up, now waiting for results queue to drain
33932 1726882882.97684: waiting for pending results...
33932 1726882882.97910: running TaskExecutor() for managed_node1/TASK: Enable EPEL 8
33932 1726882882.98003: in run() - task 0e448fcc-3ce9-615b-5c48-0000000000e3
33932 1726882882.98020: variable 'ansible_search_path' from source: unknown
33932 1726882882.98028: variable 'ansible_search_path' from source: unknown
33932 1726882882.98060: calling self._execute()
33932 1726882882.98122: variable 'ansible_host' from source: host vars for 'managed_node1'
33932 1726882882.98135: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33932 1726882882.98146: variable 'omit' from source: magic vars
33932 1726882882.98559: variable 'ansible_distribution' from source: facts
33932 1726882882.98582: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
33932 1726882882.98716: variable 'ansible_distribution_major_version' from source: facts
33932 1726882882.98727: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
33932 1726882882.98734: when evaluation is False, skipping this task
33932 1726882882.98741: _execute() done
33932 1726882882.98749: dumping result to json
33932 1726882882.98756: done dumping result, returning
33932 1726882882.98767: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 [0e448fcc-3ce9-615b-5c48-0000000000e3]
33932 1726882882.98778: sending task result for task 0e448fcc-3ce9-615b-5c48-0000000000e3
33932 1726882882.98877: done sending task result for task 0e448fcc-3ce9-615b-5c48-0000000000e3
33932 1726882882.98886: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
33932 1726882882.98938: no more pending results, returning what we have
33932 1726882882.98941: results queue empty
33932 1726882882.98942: checking for any_errors_fatal
33932 1726882882.98948: done checking for any_errors_fatal
33932 1726882882.98949: checking for max_fail_percentage
33932 1726882882.98950: done checking for max_fail_percentage
33932 1726882882.98952: checking to see if all hosts have failed and the running result is not ok
33932 1726882882.98953: done checking to see if all hosts have failed
33932 1726882882.98954: getting the remaining hosts for this loop
33932 1726882882.98955: done getting the remaining hosts for this loop
33932 1726882882.98959: getting the next task for host managed_node1
33932 1726882882.98969: done getting next task for host managed_node1
33932 1726882882.98972: ^ task is: TASK: Enable EPEL 6
33932 1726882882.98976: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33932 1726882882.98979: getting variables
33932 1726882882.98981: in VariableManager get_vars()
33932 1726882882.99047: Calling all_inventory to load vars for managed_node1
33932 1726882882.99050: Calling groups_inventory to load vars for managed_node1
33932 1726882882.99055: Calling all_plugins_inventory to load vars for managed_node1
33932 1726882882.99068: Calling all_plugins_play to load vars for managed_node1
33932 1726882882.99072: Calling groups_plugins_inventory to load vars for managed_node1
33932 1726882882.99076: Calling groups_plugins_play to load vars for managed_node1
33932 1726882882.99238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33932 1726882882.99501: done with get_vars()
33932 1726882882.99510: done getting variables
33932 1726882882.99584: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 6] ***********************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42
Friday 20 September 2024 21:41:22 -0400 (0:00:00.023) 0:00:03.463 ******
33932 1726882882.99617: entering _queue_task() for managed_node1/copy
33932 1726882883.00575: worker is 1 (out of 1 available)
33932 1726882883.00586: exiting _queue_task() for managed_node1/copy
33932 1726882883.00732: done queuing things up, now waiting for results queue to drain
33932 1726882883.00734: waiting for pending results...
33932 1726882883.01797: running TaskExecutor() for managed_node1/TASK: Enable EPEL 6
33932 1726882883.01930: in run() - task 0e448fcc-3ce9-615b-5c48-0000000000e5
33932 1726882883.01949: variable 'ansible_search_path' from source: unknown
33932 1726882883.01958: variable 'ansible_search_path' from source: unknown
33932 1726882883.01999: calling self._execute()
33932 1726882883.02091: variable 'ansible_host' from source: host vars for 'managed_node1'
33932 1726882883.02103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33932 1726882883.02117: variable 'omit' from source: magic vars
33932 1726882883.02499: variable 'ansible_distribution' from source: facts
33932 1726882883.02517: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
33932 1726882883.02634: variable 'ansible_distribution_major_version' from source: facts
33932 1726882883.02645: Evaluated conditional (ansible_distribution_major_version == '6'): False
33932 1726882883.02653: when evaluation is False, skipping this task
33932 1726882883.02660: _execute() done
33932 1726882883.02675: dumping result to json
33932 1726882883.02684: done dumping result, returning
33932 1726882883.02697: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 [0e448fcc-3ce9-615b-5c48-0000000000e5]
33932 1726882883.02708: sending task result for task 0e448fcc-3ce9-615b-5c48-0000000000e5
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '6'",
    "skip_reason": "Conditional result was False"
}
33932 1726882883.02850: no more pending results, returning what we have
33932 1726882883.02854: results queue empty
33932 1726882883.02855: checking for any_errors_fatal
33932 1726882883.02859: done checking for any_errors_fatal
33932 1726882883.02860: checking for max_fail_percentage
33932 1726882883.02861: done checking for max_fail_percentage
33932 1726882883.02863: checking to see if all hosts have failed and the running result is not ok
33932 1726882883.02865: done checking to see if all hosts have failed
33932 1726882883.02866: getting the remaining hosts for this loop
33932 1726882883.02868: done getting the remaining hosts for this loop
33932 1726882883.02871: getting the next task for host managed_node1
33932 1726882883.02880: done getting next task for host managed_node1
33932 1726882883.02883: ^ task is: TASK: Set network provider to 'nm'
33932 1726882883.02886: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33932 1726882883.02889: getting variables
33932 1726882883.02891: in VariableManager get_vars()
33932 1726882883.02919: Calling all_inventory to load vars for managed_node1
33932 1726882883.02922: Calling groups_inventory to load vars for managed_node1
33932 1726882883.02925: Calling all_plugins_inventory to load vars for managed_node1
33932 1726882883.02938: Calling all_plugins_play to load vars for managed_node1
33932 1726882883.02941: Calling groups_plugins_inventory to load vars for managed_node1
33932 1726882883.02944: Calling groups_plugins_play to load vars for managed_node1
33932 1726882883.03120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33932 1726882883.03365: done with get_vars()
33932 1726882883.03375: done getting variables
33932 1726882883.03442: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Set network provider to 'nm'] ********************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_vlan_mtu_nm.yml:13
Friday 20 September 2024 21:41:23 -0400 (0:00:00.038) 0:00:03.502 ******
33932 1726882883.03503: entering _queue_task() for managed_node1/set_fact
33932 1726882883.03520: done sending task result for task 0e448fcc-3ce9-615b-5c48-0000000000e5
33932 1726882883.03545: WORKER PROCESS EXITING
33932 1726882883.04029: worker is 1 (out of 1 available)
33932 1726882883.04040: exiting _queue_task() for managed_node1/set_fact
33932 1726882883.04051: done queuing things up, now waiting for results queue to drain
33932 1726882883.04053: waiting for pending results...
33932 1726882883.04284: running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm'
33932 1726882883.04369: in run() - task 0e448fcc-3ce9-615b-5c48-000000000007
33932 1726882883.04388: variable 'ansible_search_path' from source: unknown
33932 1726882883.04431: calling self._execute()
33932 1726882883.04572: variable 'ansible_host' from source: host vars for 'managed_node1'
33932 1726882883.04584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33932 1726882883.04597: variable 'omit' from source: magic vars
33932 1726882883.04699: variable 'omit' from source: magic vars
33932 1726882883.04738: variable 'omit' from source: magic vars
33932 1726882883.04778: variable 'omit' from source: magic vars
33932 1726882883.04821: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
33932 1726882883.04866: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
33932 1726882883.04890: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
33932 1726882883.04912: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
33932 1726882883.04927: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
33932 1726882883.04966: variable 'inventory_hostname' from source: host vars for 'managed_node1'
33932 1726882883.04976: variable 'ansible_host' from source: host vars for 'managed_node1'
33932 1726882883.04985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33932 1726882883.05092: Set connection var ansible_shell_executable to /bin/sh
33932 1726882883.05106: Set connection var ansible_timeout to 10
33932 1726882883.05116: Set connection var ansible_module_compression to ZIP_DEFLATED
33932 1726882883.05126: Set connection var ansible_pipelining to False
33932 1726882883.05133: Set connection var ansible_connection to ssh
33932 1726882883.05138: Set connection var ansible_shell_type to sh
33932 1726882883.05171: variable 'ansible_shell_executable' from source: unknown
33932 1726882883.05180: variable 'ansible_connection' from source: unknown
33932 1726882883.05187: variable 'ansible_module_compression' from source: unknown
33932 1726882883.05194: variable 'ansible_shell_type' from source: unknown
33932 1726882883.05201: variable 'ansible_shell_executable' from source: unknown
33932 1726882883.05208: variable 'ansible_host' from source: host vars for 'managed_node1'
33932 1726882883.05216: variable 'ansible_pipelining' from source: unknown
33932 1726882883.05222: variable 'ansible_timeout' from source: unknown
33932 1726882883.05229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33932 1726882883.05370: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
33932 1726882883.05389: variable 'omit' from source: magic vars
33932 1726882883.05400: starting attempt loop
33932 1726882883.05407: running the handler
33932 1726882883.05423: handler run complete
33932 1726882883.05437: attempt loop complete, returning result
33932 1726882883.05444: _execute() done
33932 1726882883.05450: dumping result to json
33932 1726882883.05458: done dumping result, returning
33932 1726882883.05472: done running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' [0e448fcc-3ce9-615b-5c48-000000000007]
33932 1726882883.05489: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000007
ok: [managed_node1] => {
    "ansible_facts": {
        "network_provider": "nm"
    },
    "changed": false
}
33932 1726882883.05707: no more pending results, returning what we have
33932 1726882883.05710: results queue empty
33932 1726882883.05711: checking for any_errors_fatal
33932 1726882883.05717: done checking for any_errors_fatal
33932 1726882883.05717: checking for max_fail_percentage
33932 1726882883.05719: done checking for max_fail_percentage
33932 1726882883.05720: checking to see if all hosts have failed and the running result is not ok
33932 1726882883.05721: done checking to see if all hosts have failed
33932 1726882883.05722: getting the remaining hosts for this loop
33932 1726882883.05724: done getting the remaining hosts for this loop
33932 1726882883.05727: getting the next task for host managed_node1
33932 1726882883.05733: done getting next task for host managed_node1
33932 1726882883.05735: ^ task is: TASK: meta (flush_handlers)
33932 1726882883.05737: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33932 1726882883.05742: getting variables
33932 1726882883.05744: in VariableManager get_vars()
33932 1726882883.05769: Calling all_inventory to load vars for managed_node1
33932 1726882883.05772: Calling groups_inventory to load vars for managed_node1
33932 1726882883.05776: Calling all_plugins_inventory to load vars for managed_node1
33932 1726882883.05786: Calling all_plugins_play to load vars for managed_node1
33932 1726882883.05790: Calling groups_plugins_inventory to load vars for managed_node1
33932 1726882883.05793: Calling groups_plugins_play to load vars for managed_node1
33932 1726882883.05946: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33932 1726882883.06630: done with get_vars()
33932 1726882883.06639: done getting variables
33932 1726882883.06710: in VariableManager get_vars()
33932 1726882883.06718: Calling all_inventory to load vars for managed_node1
33932 1726882883.06720: Calling groups_inventory to load vars for managed_node1
33932 1726882883.06723: Calling all_plugins_inventory to load vars for managed_node1
33932 1726882883.06727: Calling all_plugins_play to load vars for managed_node1
33932 1726882883.06729: Calling groups_plugins_inventory to load vars for managed_node1
33932 1726882883.06731: Calling groups_plugins_play to load vars for managed_node1
33932 1726882883.06868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33932 1726882883.07048: done with get_vars()
33932 1726882883.07061: done queuing things up, now waiting for results queue to drain
33932 1726882883.07064: results queue empty
33932 1726882883.07065: checking for any_errors_fatal
33932 1726882883.07067: done checking for any_errors_fatal
33932 1726882883.07067: checking for max_fail_percentage
33932 1726882883.07068: done checking for max_fail_percentage
33932 1726882883.07069: checking to see if all hosts have failed and the running result is not ok
33932 1726882883.07069: done checking to see if all hosts have failed
33932 1726882883.07070: getting the remaining hosts for this loop
33932 1726882883.07071: done getting the remaining hosts for this loop
33932 1726882883.07073: getting the next task for host managed_node1
33932 1726882883.07078: done getting next task for host managed_node1
33932 1726882883.07079: ^ task is: TASK: meta (flush_handlers)
33932 1726882883.07081: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33932 1726882883.07089: getting variables
33932 1726882883.07090: in VariableManager get_vars()
33932 1726882883.07097: Calling all_inventory to load vars for managed_node1
33932 1726882883.07099: Calling groups_inventory to load vars for managed_node1
33932 1726882883.07101: Calling all_plugins_inventory to load vars for managed_node1
33932 1726882883.07106: Calling all_plugins_play to load vars for managed_node1
33932 1726882883.07108: Calling groups_plugins_inventory to load vars for managed_node1
33932 1726882883.07111: Calling groups_plugins_play to load vars for managed_node1
33932 1726882883.07474: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000007
33932 1726882883.07478: WORKER PROCESS EXITING
33932 1726882883.07494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33932 1726882883.07670: done with get_vars()
33932 1726882883.07677: done getting variables
33932 1726882883.07719: in VariableManager get_vars()
33932 1726882883.07728: Calling all_inventory to load vars for managed_node1
33932 1726882883.07730: Calling groups_inventory to load vars for managed_node1
33932 1726882883.07733: Calling all_plugins_inventory to load vars for managed_node1
33932 1726882883.07737: Calling all_plugins_play to load vars for managed_node1
33932 1726882883.07739: Calling groups_plugins_inventory to load vars for managed_node1
33932 1726882883.07742: Calling groups_plugins_play to load vars for managed_node1
33932 1726882883.07871: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33932 1726882883.08047: done with get_vars()
33932 1726882883.08058: done queuing things up, now waiting for results queue to drain
33932 1726882883.08060: results queue empty
33932 1726882883.08061: checking for any_errors_fatal
33932 1726882883.08062: done checking for any_errors_fatal
33932 1726882883.08063: checking for max_fail_percentage
33932 1726882883.08066: done checking for max_fail_percentage
33932 1726882883.08066: checking to see if all hosts have failed and the running result is not ok
33932 1726882883.08067: done checking to see if all hosts have failed
33932 1726882883.08068: getting the remaining hosts for this loop
33932 1726882883.08069: done getting the remaining hosts for this loop
33932 1726882883.08071: getting the next task for host managed_node1
33932 1726882883.08074: done getting next task for host managed_node1
33932 1726882883.08075: ^ task is: None
33932 1726882883.08076: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33932 1726882883.08077: done queuing things up, now waiting for results queue to drain
33932 1726882883.08078: results queue empty
33932 1726882883.08079: checking for any_errors_fatal
33932 1726882883.08080: done checking for any_errors_fatal
33932 1726882883.08080: checking for max_fail_percentage
33932 1726882883.08081: done checking for max_fail_percentage
33932 1726882883.08082: checking to see if all hosts have failed and the running result is not ok
33932 1726882883.08083: done checking to see if all hosts have failed
33932 1726882883.08084: getting the next task for host managed_node1
33932 1726882883.08086: done getting next task for host managed_node1
33932 1726882883.08087: ^ task is: None
33932 1726882883.08088: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 33932 1726882883.08384: in VariableManager get_vars() 33932 1726882883.08407: done with get_vars() 33932 1726882883.08413: in VariableManager get_vars() 33932 1726882883.08429: done with get_vars() 33932 1726882883.08433: variable 'omit' from source: magic vars 33932 1726882883.08469: in VariableManager get_vars() 33932 1726882883.08487: done with get_vars() 33932 1726882883.08511: variable 'omit' from source: magic vars PLAY [Play for testing vlan mtu setting] *************************************** 33932 1726882883.08869: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 33932 1726882883.09546: getting the remaining hosts for this loop 33932 1726882883.09547: done getting the remaining hosts for this loop 33932 1726882883.09550: getting the next task for host managed_node1 33932 1726882883.09553: done getting next task for host managed_node1 33932 1726882883.09555: ^ task is: TASK: Gathering Facts 33932 1726882883.09557: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882883.09559: getting variables 33932 1726882883.09560: in VariableManager get_vars() 33932 1726882883.09577: Calling all_inventory to load vars for managed_node1 33932 1726882883.09615: Calling groups_inventory to load vars for managed_node1 33932 1726882883.09619: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882883.09624: Calling all_plugins_play to load vars for managed_node1 33932 1726882883.09638: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882883.09641: Calling groups_plugins_play to load vars for managed_node1 33932 1726882883.09791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882883.09972: done with get_vars() 33932 1726882883.09980: done getting variables 33932 1726882883.10020: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:3 Friday 20 September 2024 21:41:23 -0400 (0:00:00.065) 0:00:03.568 ****** 33932 1726882883.10044: entering _queue_task() for managed_node1/gather_facts 33932 1726882883.10747: worker is 1 (out of 1 available) 33932 1726882883.10759: exiting _queue_task() for managed_node1/gather_facts 33932 1726882883.10772: done queuing things up, now waiting for results queue to drain 33932 1726882883.10774: waiting for pending results... 
33932 1726882883.12327: running TaskExecutor() for managed_node1/TASK: Gathering Facts 33932 1726882883.12553: in run() - task 0e448fcc-3ce9-615b-5c48-00000000010b 33932 1726882883.12693: variable 'ansible_search_path' from source: unknown 33932 1726882883.12739: calling self._execute() 33932 1726882883.13711: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882883.13723: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882883.13737: variable 'omit' from source: magic vars 33932 1726882883.14438: variable 'ansible_distribution_major_version' from source: facts 33932 1726882883.14592: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882883.14603: variable 'omit' from source: magic vars 33932 1726882883.14634: variable 'omit' from source: magic vars 33932 1726882883.14696: variable 'omit' from source: magic vars 33932 1726882883.14744: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882883.14836: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882883.14925: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882883.15031: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882883.15047: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882883.15084: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882883.15093: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882883.15122: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882883.15346: Set connection var ansible_shell_executable to /bin/sh 33932 1726882883.15361: Set 
connection var ansible_timeout to 10 33932 1726882883.15379: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882883.15450: Set connection var ansible_pipelining to False 33932 1726882883.15461: Set connection var ansible_connection to ssh 33932 1726882883.15475: Set connection var ansible_shell_type to sh 33932 1726882883.15506: variable 'ansible_shell_executable' from source: unknown 33932 1726882883.15515: variable 'ansible_connection' from source: unknown 33932 1726882883.15555: variable 'ansible_module_compression' from source: unknown 33932 1726882883.15572: variable 'ansible_shell_type' from source: unknown 33932 1726882883.15581: variable 'ansible_shell_executable' from source: unknown 33932 1726882883.15588: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882883.15596: variable 'ansible_pipelining' from source: unknown 33932 1726882883.15670: variable 'ansible_timeout' from source: unknown 33932 1726882883.15681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882883.15897: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882883.15914: variable 'omit' from source: magic vars 33932 1726882883.15922: starting attempt loop 33932 1726882883.15929: running the handler 33932 1726882883.15947: variable 'ansible_facts' from source: unknown 33932 1726882883.15990: _low_level_execute_command(): starting 33932 1726882883.16025: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 33932 1726882883.16788: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882883.16802: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 
1726882883.16816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882883.16833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882883.16882: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882883.16894: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882883.16907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882883.16924: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882883.16935: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882883.16945: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882883.16956: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882883.16972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882883.16992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882883.17003: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882883.17015: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882883.17028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882883.17111: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882883.17132: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882883.17147: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882883.17291: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 33932 
1726882883.19625: stdout chunk (state=3): >>>/root <<< 33932 1726882883.19871: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882883.19874: stdout chunk (state=3): >>><<< 33932 1726882883.19876: stderr chunk (state=3): >>><<< 33932 1726882883.20007: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 33932 1726882883.20011: _low_level_execute_command(): starting 33932 1726882883.20014: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882883.1990604-34130-229506355445738 `" && echo ansible-tmp-1726882883.1990604-34130-229506355445738="` echo /root/.ansible/tmp/ansible-tmp-1726882883.1990604-34130-229506355445738 `" ) && sleep 0' 33932 1726882883.20624: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 33932 1726882883.20636: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882883.20657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882883.20680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882883.20723: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882883.20734: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882883.20747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882883.20773: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882883.20786: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882883.20796: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882883.20807: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882883.20818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882883.20832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882883.20843: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882883.20853: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882883.20866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882883.20948: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882883.20972: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882883.20995: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 33932 1726882883.21156: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 33932 1726882883.23924: stdout chunk (state=3): >>>ansible-tmp-1726882883.1990604-34130-229506355445738=/root/.ansible/tmp/ansible-tmp-1726882883.1990604-34130-229506355445738 <<< 33932 1726882883.24189: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882883.24192: stdout chunk (state=3): >>><<< 33932 1726882883.24195: stderr chunk (state=3): >>><<< 33932 1726882883.24459: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882883.1990604-34130-229506355445738=/root/.ansible/tmp/ansible-tmp-1726882883.1990604-34130-229506355445738 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 33932 1726882883.24463: variable 'ansible_module_compression' from source: unknown 33932 1726882883.24468: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-33932njcnmxb6/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 33932 1726882883.24471: variable 'ansible_facts' from source: unknown 33932 1726882883.24550: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882883.1990604-34130-229506355445738/AnsiballZ_setup.py 33932 1726882883.25130: Sending initial data 33932 1726882883.25134: Sent initial data (154 bytes) 33932 1726882883.27395: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882883.27399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882883.27432: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 33932 1726882883.27435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882883.27438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882883.28069: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882883.28084: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882883.28221: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 33932 1726882883.30851: stderr chunk (state=3): >>>debug2: 
Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 33932 1726882883.30946: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 33932 1726882883.31046: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-33932njcnmxb6/tmptr70mvad /root/.ansible/tmp/ansible-tmp-1726882883.1990604-34130-229506355445738/AnsiballZ_setup.py <<< 33932 1726882883.31140: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 33932 1726882883.33970: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882883.34319: stderr chunk (state=3): >>><<< 33932 1726882883.34335: stdout chunk (state=3): >>><<< 33932 1726882883.34412: done transferring module to remote 33932 1726882883.34491: _low_level_execute_command(): starting 33932 1726882883.34494: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882883.1990604-34130-229506355445738/ /root/.ansible/tmp/ansible-tmp-1726882883.1990604-34130-229506355445738/AnsiballZ_setup.py && sleep 0' 33932 1726882883.37110: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882883.37140: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882883.37170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 
1726882883.37197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882883.37252: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882883.37266: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882883.37283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882883.37300: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882883.37311: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882883.37322: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882883.37336: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882883.37349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882883.37362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882883.37379: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882883.37394: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882883.37418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882883.37516: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882883.37538: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882883.37561: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882883.37739: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 33932 1726882883.40201: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882883.40496: stderr 
chunk (state=3): >>><<< 33932 1726882883.40503: stdout chunk (state=3): >>><<< 33932 1726882883.40542: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 33932 1726882883.40571: _low_level_execute_command(): starting 33932 1726882883.40608: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882883.1990604-34130-229506355445738/AnsiballZ_setup.py && sleep 0' 33932 1726882883.42174: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882883.42190: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882883.42205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882883.42224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 
1726882883.42276: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882883.42400: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882883.42416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882883.42435: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882883.42456: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882883.42488: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882883.42502: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882883.42523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882883.42619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882883.42633: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882883.42645: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882883.42659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882883.42757: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882883.42787: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882883.42818: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882883.42957: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 33932 1726882884.09460: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", 
"ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_lsb": {}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=",<<< 33932 1726882884.09494: stdout chunk (state=3): >>> "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_is_chroot": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "41", "second": "23", "epoch": "1726882883", "epoch_int": "1726882883", "date": "2024-09-20", "time": "21:41:23", "iso8601_micro": "2024-09-21T01:41:23.808141Z", "iso8601": "2024-09-21T01:41:23Z", "iso8601_basic": "20240920T214123808141", "iso8601_basic_short": "20240920T214123", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], 
"nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], 
"ansible_processor_count": 1, "ansible<<< 33932 1726882884.09551: stdout chunk (state=3): >>>_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2785, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 747, "free": 2785}, "nocache": {"free": 3250, "used": 282}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1041, "ansible_lvm": "N/A", 
"ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264234377216, "block_size": 4096, "block_total": 65519355, "block_available": 64510346, "block_used": 1009009, "inode_total": 131071472, "inode_available": 130998690, "inode_used": 72782, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off 
[fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", 
"tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", 
"macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_iscsi_iqn": "", "ansible_pkg_mgr": "dnf", "ansible_fips": false, "ansible_loadavg": {"1m": 0.63, "5m": 0.55, "15m": 0.33}, "ansible_fibre_channel_wwn": [], "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 33932 1726882884.11897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 33932 1726882884.11901: stdout chunk (state=3): >>><<< 33932 1726882884.11903: stderr chunk (state=3): >>><<< 33932 1726882884.12066: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_lsb": {}, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, 
"ansible_apparmor": {"status": "disabled"}, "ansible_is_chroot": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "41", "second": "23", "epoch": "1726882883", "epoch_int": "1726882883", "date": "2024-09-20", "time": "21:41:23", "iso8601_micro": "2024-09-21T01:41:23.808141Z", "iso8601": "2024-09-21T01:41:23Z", "iso8601_basic": "20240920T214123808141", "iso8601_basic_short": "20240920T214123", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2785, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 747, "free": 2785}, "nocache": {"free": 3250, "used": 282}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": 
"NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1041, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264234377216, "block_size": 4096, "block_total": 65519355, "block_available": 64510346, "block_used": 1009009, "inode_total": 131071472, "inode_available": 130998690, "inode_used": 72782, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": 
"255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", 
"rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off 
[fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_iscsi_iqn": "", "ansible_pkg_mgr": "dnf", "ansible_fips": false, "ansible_loadavg": {"1m": 0.63, "5m": 0.55, "15m": 0.33}, "ansible_fibre_channel_wwn": [], "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
33932 1726882884.13009: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882883.1990604-34130-229506355445738/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 33932 1726882884.13033: _low_level_execute_command(): starting 33932 1726882884.13041: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882883.1990604-34130-229506355445738/ > /dev/null 2>&1 && sleep 0' 33932 1726882884.14090: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882884.14453: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882884.14456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882884.14490: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 33932 1726882884.14493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882884.14519: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882884.14599: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882884.14602: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882884.14728: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 33932 1726882884.17139: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882884.17259: stderr chunk (state=3): >>><<< 33932 1726882884.17263: stdout chunk (state=3): >>><<< 33932 1726882884.17305: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit 
status from master 0 33932 1726882884.17640: handler run complete 33932 1726882884.18360: variable 'ansible_facts' from source: unknown 33932 1726882884.18371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882884.19002: variable 'ansible_facts' from source: unknown 33932 1726882884.19095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882884.19228: attempt loop complete, returning result 33932 1726882884.19238: _execute() done 33932 1726882884.19245: dumping result to json 33932 1726882884.19287: done dumping result, returning 33932 1726882884.19301: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0e448fcc-3ce9-615b-5c48-00000000010b] 33932 1726882884.19311: sending task result for task 0e448fcc-3ce9-615b-5c48-00000000010b ok: [managed_node1] 33932 1726882884.19993: no more pending results, returning what we have 33932 1726882884.19997: results queue empty 33932 1726882884.19998: checking for any_errors_fatal 33932 1726882884.19999: done checking for any_errors_fatal 33932 1726882884.20000: checking for max_fail_percentage 33932 1726882884.20001: done checking for max_fail_percentage 33932 1726882884.20002: checking to see if all hosts have failed and the running result is not ok 33932 1726882884.20003: done checking to see if all hosts have failed 33932 1726882884.20004: getting the remaining hosts for this loop 33932 1726882884.20006: done getting the remaining hosts for this loop 33932 1726882884.20009: getting the next task for host managed_node1 33932 1726882884.20016: done getting next task for host managed_node1 33932 1726882884.20017: ^ task is: TASK: meta (flush_handlers) 33932 1726882884.20019: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33932 1726882884.20023: getting variables 33932 1726882884.20025: in VariableManager get_vars() 33932 1726882884.20060: Calling all_inventory to load vars for managed_node1 33932 1726882884.20065: Calling groups_inventory to load vars for managed_node1 33932 1726882884.20068: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882884.20079: Calling all_plugins_play to load vars for managed_node1 33932 1726882884.20082: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882884.20085: Calling groups_plugins_play to load vars for managed_node1 33932 1726882884.20218: done sending task result for task 0e448fcc-3ce9-615b-5c48-00000000010b 33932 1726882884.20221: WORKER PROCESS EXITING 33932 1726882884.20243: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882884.20602: done with get_vars() 33932 1726882884.20611: done getting variables 33932 1726882884.20678: in VariableManager get_vars() 33932 1726882884.20691: Calling all_inventory to load vars for managed_node1 33932 1726882884.20693: Calling groups_inventory to load vars for managed_node1 33932 1726882884.20694: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882884.20698: Calling all_plugins_play to load vars for managed_node1 33932 1726882884.20700: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882884.20702: Calling groups_plugins_play to load vars for managed_node1 33932 1726882884.20824: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882884.21536: done with get_vars() 33932 1726882884.21548: done queuing things up, now waiting for results queue to drain 33932 1726882884.21550: results queue empty 33932 1726882884.21550: checking for any_errors_fatal 33932 1726882884.21553: done 
checking for any_errors_fatal 33932 1726882884.21554: checking for max_fail_percentage 33932 1726882884.21555: done checking for max_fail_percentage 33932 1726882884.21560: checking to see if all hosts have failed and the running result is not ok 33932 1726882884.21560: done checking to see if all hosts have failed 33932 1726882884.21561: getting the remaining hosts for this loop 33932 1726882884.21562: done getting the remaining hosts for this loop 33932 1726882884.21566: getting the next task for host managed_node1 33932 1726882884.21570: done getting next task for host managed_node1 33932 1726882884.21572: ^ task is: TASK: Include the task 'show_interfaces.yml' 33932 1726882884.21573: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33932 1726882884.21575: getting variables 33932 1726882884.21576: in VariableManager get_vars() 33932 1726882884.21588: Calling all_inventory to load vars for managed_node1 33932 1726882884.21590: Calling groups_inventory to load vars for managed_node1 33932 1726882884.21592: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882884.21596: Calling all_plugins_play to load vars for managed_node1 33932 1726882884.21599: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882884.21601: Calling groups_plugins_play to load vars for managed_node1 33932 1726882884.21730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882884.21921: done with get_vars() 33932 1726882884.21929: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:10 Friday 20 September 2024 21:41:24 -0400 (0:00:01.119) 0:00:04.687 ****** 33932 1726882884.21996: entering _queue_task() for managed_node1/include_tasks 33932 1726882884.22433: worker is 1 (out of 1 available) 33932 1726882884.22445: exiting _queue_task() for managed_node1/include_tasks 33932 1726882884.22456: done queuing things up, now waiting for results queue to drain 33932 1726882884.22458: waiting for pending results... 33932 1726882884.22710: running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' 33932 1726882884.22803: in run() - task 0e448fcc-3ce9-615b-5c48-00000000000b 33932 1726882884.22821: variable 'ansible_search_path' from source: unknown 33932 1726882884.22862: calling self._execute() 33932 1726882884.22943: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882884.22955: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882884.22969: variable 'omit' from source: magic vars 33932 1726882884.23413: variable 'ansible_distribution_major_version' from source: facts 33932 1726882884.23429: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882884.23443: _execute() done 33932 1726882884.23450: dumping result to json 33932 1726882884.23458: done dumping result, returning 33932 1726882884.23469: done running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' [0e448fcc-3ce9-615b-5c48-00000000000b] 33932 1726882884.23481: sending task result for task 0e448fcc-3ce9-615b-5c48-00000000000b 33932 1726882884.23602: no more pending results, returning what we have 33932 1726882884.23607: in VariableManager get_vars() 33932 1726882884.23651: Calling all_inventory to load vars for managed_node1 33932 1726882884.23653: Calling groups_inventory to load vars for managed_node1 33932 1726882884.23656: Calling 
all_plugins_inventory to load vars for managed_node1 33932 1726882884.23670: Calling all_plugins_play to load vars for managed_node1 33932 1726882884.23674: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882884.23677: Calling groups_plugins_play to load vars for managed_node1 33932 1726882884.23901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882884.24281: done with get_vars() 33932 1726882884.24287: variable 'ansible_search_path' from source: unknown 33932 1726882884.24301: we have included files to process 33932 1726882884.24302: generating all_blocks data 33932 1726882884.24303: done generating all_blocks data 33932 1726882884.24304: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 33932 1726882884.24305: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 33932 1726882884.24307: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 33932 1726882884.24470: in VariableManager get_vars() 33932 1726882884.24491: done with get_vars() 33932 1726882884.24832: done processing included file 33932 1726882884.24834: iterating over new_blocks loaded from include file 33932 1726882884.24836: in VariableManager get_vars() 33932 1726882884.24855: done with get_vars() 33932 1726882884.24857: filtering new block on tags 33932 1726882884.24875: done filtering new block on tags 33932 1726882884.24878: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node1 33932 1726882884.24882: extending task lists for all hosts with included blocks 33932 1726882884.25128: done sending task 
result for task 0e448fcc-3ce9-615b-5c48-00000000000b 33932 1726882884.25131: WORKER PROCESS EXITING 33932 1726882884.26875: done extending task lists 33932 1726882884.26877: done processing included files 33932 1726882884.26878: results queue empty 33932 1726882884.26879: checking for any_errors_fatal 33932 1726882884.26880: done checking for any_errors_fatal 33932 1726882884.26881: checking for max_fail_percentage 33932 1726882884.26882: done checking for max_fail_percentage 33932 1726882884.26883: checking to see if all hosts have failed and the running result is not ok 33932 1726882884.26883: done checking to see if all hosts have failed 33932 1726882884.26884: getting the remaining hosts for this loop 33932 1726882884.26886: done getting the remaining hosts for this loop 33932 1726882884.26888: getting the next task for host managed_node1 33932 1726882884.26891: done getting next task for host managed_node1 33932 1726882884.26893: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 33932 1726882884.26896: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882884.26898: getting variables 33932 1726882884.26899: in VariableManager get_vars() 33932 1726882884.26913: Calling all_inventory to load vars for managed_node1 33932 1726882884.26915: Calling groups_inventory to load vars for managed_node1 33932 1726882884.26917: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882884.26922: Calling all_plugins_play to load vars for managed_node1 33932 1726882884.26925: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882884.26927: Calling groups_plugins_play to load vars for managed_node1 33932 1726882884.27067: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882884.27260: done with get_vars() 33932 1726882884.27271: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:41:24 -0400 (0:00:00.053) 0:00:04.740 ****** 33932 1726882884.27337: entering _queue_task() for managed_node1/include_tasks 33932 1726882884.27597: worker is 1 (out of 1 available) 33932 1726882884.27608: exiting _queue_task() for managed_node1/include_tasks 33932 1726882884.27619: done queuing things up, now waiting for results queue to drain 33932 1726882884.27620: waiting for pending results... 
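The entries above show the TaskExecutor evaluating the conditional `(ansible_distribution_major_version != '6')` against the host's facts before each include is queued. As a rough illustration of that gating step, here is a minimal Python sketch; it is a simplified stand-in, not Ansible's real code path (Ansible renders the `when:` expression through Jinja2 templating against the full variable set, while this just compares the one fact directly):

```python
# Hedged sketch of how a `when: ansible_distribution_major_version != '6'`
# clause gates a task. The fact value "9" below is hypothetical; the log
# excerpt does not show which distribution version the managed node reported.
def evaluate_conditional(facts: dict) -> bool:
    # Ansible would template the expression with Jinja2; here we compare
    # the single fact directly for illustration.
    return facts.get("ansible_distribution_major_version") != "6"

facts = {"ansible_distribution_major_version": "9"}  # assumed fact value
if evaluate_conditional(facts):
    print("Evaluated conditional (ansible_distribution_major_version != '6'): True")
```

When the conditional is true, the include's blocks are loaded and spliced into the host's task list, which is what the "extending task lists for all hosts with included blocks" entries record.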
33932 1726882884.27860: running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' 33932 1726882884.27950: in run() - task 0e448fcc-3ce9-615b-5c48-000000000120 33932 1726882884.27972: variable 'ansible_search_path' from source: unknown 33932 1726882884.27979: variable 'ansible_search_path' from source: unknown 33932 1726882884.28017: calling self._execute() 33932 1726882884.28133: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882884.28145: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882884.28159: variable 'omit' from source: magic vars 33932 1726882884.28513: variable 'ansible_distribution_major_version' from source: facts 33932 1726882884.28531: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882884.28541: _execute() done 33932 1726882884.28548: dumping result to json 33932 1726882884.28556: done dumping result, returning 33932 1726882884.28567: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' [0e448fcc-3ce9-615b-5c48-000000000120] 33932 1726882884.28578: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000120 33932 1726882884.28678: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000120 33932 1726882884.28686: WORKER PROCESS EXITING 33932 1726882884.28736: no more pending results, returning what we have 33932 1726882884.28742: in VariableManager get_vars() 33932 1726882884.28788: Calling all_inventory to load vars for managed_node1 33932 1726882884.28792: Calling groups_inventory to load vars for managed_node1 33932 1726882884.28820: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882884.28834: Calling all_plugins_play to load vars for managed_node1 33932 1726882884.28838: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882884.28841: Calling groups_plugins_play to load vars for managed_node1 33932 
1726882884.29014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882884.29213: done with get_vars() 33932 1726882884.29220: variable 'ansible_search_path' from source: unknown 33932 1726882884.29221: variable 'ansible_search_path' from source: unknown 33932 1726882884.29260: we have included files to process 33932 1726882884.29261: generating all_blocks data 33932 1726882884.29263: done generating all_blocks data 33932 1726882884.29266: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 33932 1726882884.29267: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 33932 1726882884.29269: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 33932 1726882884.29765: done processing included file 33932 1726882884.29767: iterating over new_blocks loaded from include file 33932 1726882884.29768: in VariableManager get_vars() 33932 1726882884.29786: done with get_vars() 33932 1726882884.29788: filtering new block on tags 33932 1726882884.29805: done filtering new block on tags 33932 1726882884.29807: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node1 33932 1726882884.29811: extending task lists for all hosts with included blocks 33932 1726882884.29909: done extending task lists 33932 1726882884.29910: done processing included files 33932 1726882884.29911: results queue empty 33932 1726882884.29912: checking for any_errors_fatal 33932 1726882884.29916: done checking for any_errors_fatal 33932 1726882884.29916: checking for max_fail_percentage 33932 1726882884.29917: done 
checking for max_fail_percentage 33932 1726882884.29918: checking to see if all hosts have failed and the running result is not ok 33932 1726882884.29919: done checking to see if all hosts have failed 33932 1726882884.29920: getting the remaining hosts for this loop 33932 1726882884.29921: done getting the remaining hosts for this loop 33932 1726882884.29923: getting the next task for host managed_node1 33932 1726882884.29927: done getting next task for host managed_node1 33932 1726882884.29928: ^ task is: TASK: Gather current interface info 33932 1726882884.29931: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882884.29933: getting variables 33932 1726882884.29934: in VariableManager get_vars() 33932 1726882884.29945: Calling all_inventory to load vars for managed_node1 33932 1726882884.29947: Calling groups_inventory to load vars for managed_node1 33932 1726882884.29949: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882884.29954: Calling all_plugins_play to load vars for managed_node1 33932 1726882884.29956: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882884.29958: Calling groups_plugins_play to load vars for managed_node1 33932 1726882884.30109: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882884.30297: done with get_vars() 33932 1726882884.30306: done getting variables 33932 1726882884.30343: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:41:24 -0400 (0:00:00.030) 0:00:04.771 ****** 33932 1726882884.30372: entering _queue_task() for managed_node1/command 33932 1726882884.30594: worker is 1 (out of 1 available) 33932 1726882884.30605: exiting _queue_task() for managed_node1/command 33932 1726882884.30616: done queuing things up, now waiting for results queue to drain 33932 1726882884.30618: waiting for pending results... 
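The "Gather current interface info" task that starts here runs through the `command` action plugin, and the module result that eventually comes back (visible at the very end of this excerpt) is a JSON payload with `stdout`, `rc`, and `cmd` fields. A minimal Python sketch of consuming that result; the literal below mirrors the `{"changed": true, "stdout": "bonding_masters\neth0\nlo", ...}` payload from the log, trimmed to a few fields for illustration:

```python
import json

# Hedged sketch: parse a command-module result like the one this task
# produces and split stdout into interface names. The raw string is a
# trimmed copy of the result shown later in this log excerpt.
raw = '{"changed": true, "stdout": "bonding_masters\\neth0\\nlo", "rc": 0, "cmd": ["ls", "-1"]}'
result = json.loads(raw)
interfaces = result["stdout"].splitlines()
print(interfaces)  # ['bonding_masters', 'eth0', 'lo']
```

The test playbook's later steps would typically filter such a listing down to real interfaces (the working directory of the `ls -1` is not shown in this excerpt, so where the listing was taken is not established here).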
33932 1726882884.30872: running TaskExecutor() for managed_node1/TASK: Gather current interface info 33932 1726882884.30940: in run() - task 0e448fcc-3ce9-615b-5c48-0000000001ff 33932 1726882884.30950: variable 'ansible_search_path' from source: unknown 33932 1726882884.30953: variable 'ansible_search_path' from source: unknown 33932 1726882884.30988: calling self._execute() 33932 1726882884.31053: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882884.31057: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882884.31070: variable 'omit' from source: magic vars 33932 1726882884.31321: variable 'ansible_distribution_major_version' from source: facts 33932 1726882884.31333: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882884.31338: variable 'omit' from source: magic vars 33932 1726882884.31367: variable 'omit' from source: magic vars 33932 1726882884.31391: variable 'omit' from source: magic vars 33932 1726882884.31423: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882884.31450: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882884.31466: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882884.31482: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882884.31491: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882884.31513: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882884.31516: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882884.31520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 
1726882884.31591: Set connection var ansible_shell_executable to /bin/sh 33932 1726882884.31597: Set connection var ansible_timeout to 10 33932 1726882884.31602: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882884.31608: Set connection var ansible_pipelining to False 33932 1726882884.31611: Set connection var ansible_connection to ssh 33932 1726882884.31614: Set connection var ansible_shell_type to sh 33932 1726882884.31630: variable 'ansible_shell_executable' from source: unknown 33932 1726882884.31633: variable 'ansible_connection' from source: unknown 33932 1726882884.31636: variable 'ansible_module_compression' from source: unknown 33932 1726882884.31638: variable 'ansible_shell_type' from source: unknown 33932 1726882884.31642: variable 'ansible_shell_executable' from source: unknown 33932 1726882884.31644: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882884.31646: variable 'ansible_pipelining' from source: unknown 33932 1726882884.31648: variable 'ansible_timeout' from source: unknown 33932 1726882884.31652: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882884.31744: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882884.31752: variable 'omit' from source: magic vars 33932 1726882884.31757: starting attempt loop 33932 1726882884.31762: running the handler 33932 1726882884.31775: _low_level_execute_command(): starting 33932 1726882884.31787: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 33932 1726882884.32281: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 33932 1726882884.32297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882884.32311: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882884.32334: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882884.32375: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882884.32388: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882884.32500: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 33932 1726882884.34839: stdout chunk (state=3): >>>/root <<< 33932 1726882884.34989: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882884.35034: stderr chunk (state=3): >>><<< 33932 1726882884.35045: stdout chunk (state=3): >>><<< 33932 1726882884.35070: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 33932 1726882884.35081: _low_level_execute_command(): starting 33932 1726882884.35087: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882884.3506827-34186-245160816454354 `" && echo ansible-tmp-1726882884.3506827-34186-245160816454354="` echo /root/.ansible/tmp/ansible-tmp-1726882884.3506827-34186-245160816454354 `" ) && sleep 0' 33932 1726882884.35754: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882884.35766: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882884.35775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882884.35791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882884.35827: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882884.35833: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882884.35843: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882884.35881: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882884.35884: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882884.35893: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882884.35905: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882884.35919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882884.35935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882884.35947: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882884.35971: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882884.35986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882884.36061: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882884.36090: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882884.36105: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882884.36229: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 33932 1726882884.38888: stdout chunk (state=3): >>>ansible-tmp-1726882884.3506827-34186-245160816454354=/root/.ansible/tmp/ansible-tmp-1726882884.3506827-34186-245160816454354 <<< 33932 1726882884.39040: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882884.39102: stderr chunk (state=3): >>><<< 33932 1726882884.39105: stdout chunk (state=3): >>><<< 33932 1726882884.39471: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882884.3506827-34186-245160816454354=/root/.ansible/tmp/ansible-tmp-1726882884.3506827-34186-245160816454354 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 33932 1726882884.39475: variable 'ansible_module_compression' from source: unknown 33932 1726882884.39478: ANSIBALLZ: Using generic lock for ansible.legacy.command 33932 1726882884.39480: ANSIBALLZ: Acquiring lock 33932 1726882884.39482: ANSIBALLZ: Lock acquired: 140301144901104 33932 1726882884.39484: ANSIBALLZ: Creating module 33932 1726882884.52950: ANSIBALLZ: Writing module into payload 33932 1726882884.53066: ANSIBALLZ: Writing module 33932 1726882884.53093: ANSIBALLZ: Renaming module 33932 1726882884.53102: ANSIBALLZ: Done creating module 33932 1726882884.53119: variable 'ansible_facts' from source: unknown 33932 1726882884.53181: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726882884.3506827-34186-245160816454354/AnsiballZ_command.py 33932 1726882884.53329: Sending initial data 33932 1726882884.53332: Sent initial data (156 bytes) 33932 1726882884.54259: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882884.54279: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882884.54296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882884.54314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882884.54353: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882884.54371: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882884.54387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882884.54404: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882884.54415: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882884.54424: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882884.54435: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882884.54447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882884.54460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882884.54480: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882884.54490: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882884.54502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 
1726882884.54583: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882884.54605: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882884.54620: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882884.54740: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882884.56495: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 33932 1726882884.56588: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 33932 1726882884.56693: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-33932njcnmxb6/tmp6ir9d7aq /root/.ansible/tmp/ansible-tmp-1726882884.3506827-34186-245160816454354/AnsiballZ_command.py <<< 33932 1726882884.56798: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 33932 1726882884.58180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882884.58317: stderr chunk (state=3): >>><<< 33932 1726882884.58320: stdout chunk (state=3): >>><<< 33932 1726882884.58344: done transferring module to remote 33932 1726882884.58357: _low_level_execute_command(): starting 33932 1726882884.58360: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726882884.3506827-34186-245160816454354/ /root/.ansible/tmp/ansible-tmp-1726882884.3506827-34186-245160816454354/AnsiballZ_command.py && sleep 0' 33932 1726882884.60496: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882884.60521: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882884.60538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882884.60557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882884.60604: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882884.60620: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882884.60638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882884.60656: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882884.60676: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882884.60690: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882884.60703: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882884.60717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882884.60735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882884.60750: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882884.60783: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882884.60797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882884.60881: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882884.60905: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882884.60921: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882884.61051: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882884.62902: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882884.62905: stdout chunk (state=3): >>><<< 33932 1726882884.62907: stderr chunk (state=3): >>><<< 33932 1726882884.63001: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882884.63007: _low_level_execute_command(): starting 33932 1726882884.63010: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1726882884.3506827-34186-245160816454354/AnsiballZ_command.py && sleep 0' 33932 1726882884.64481: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882884.64488: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882884.64505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882884.64541: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882884.64547: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882884.64587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882884.64661: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882884.64667: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882884.64685: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882884.64807: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882884.78190: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:41:24.777204", "end": "2024-09-20 21:41:24.780509", "delta": "0:00:00.003305", 
"msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 33932 1726882884.79409: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 33932 1726882884.79412: stdout chunk (state=3): >>><<< 33932 1726882884.79415: stderr chunk (state=3): >>><<< 33932 1726882884.79565: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:41:24.777204", "end": "2024-09-20 21:41:24.780509", "delta": "0:00:00.003305", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 33932 1726882884.79572: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882884.3506827-34186-245160816454354/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 33932 1726882884.79579: _low_level_execute_command(): starting 33932 1726882884.79581: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882884.3506827-34186-245160816454354/ > /dev/null 2>&1 && sleep 0' 33932 1726882884.81246: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882884.81250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882884.81288: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 33932 1726882884.81292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882884.81294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882884.81362: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882884.81366: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882884.81373: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882884.81499: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882884.83287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882884.83349: stderr chunk (state=3): >>><<< 33932 1726882884.83352: stdout chunk (state=3): >>><<< 33932 1726882884.83476: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882884.83480: handler run complete 33932 1726882884.83483: Evaluated conditional (False): False 33932 1726882884.83485: attempt loop complete, returning result 33932 1726882884.83487: _execute() done 33932 1726882884.83489: dumping result to json 33932 1726882884.83490: done dumping result, returning 33932 1726882884.83492: done running TaskExecutor() for managed_node1/TASK: Gather current interface info [0e448fcc-3ce9-615b-5c48-0000000001ff] 33932 1726882884.83494: sending task result for task 0e448fcc-3ce9-615b-5c48-0000000001ff 33932 1726882884.83654: done sending task result for task 0e448fcc-3ce9-615b-5c48-0000000001ff 33932 1726882884.83658: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003305", "end": "2024-09-20 21:41:24.780509", "rc": 0, "start": "2024-09-20 21:41:24.777204" } STDOUT: bonding_masters eth0 lo 33932 1726882884.83763: no more pending results, returning what we have 33932 1726882884.83770: results queue empty 33932 1726882884.83772: checking for any_errors_fatal 33932 1726882884.83773: done checking for any_errors_fatal 33932 1726882884.83774: checking for max_fail_percentage 33932 1726882884.83776: done checking for max_fail_percentage 33932 1726882884.83777: checking to see if all hosts have failed and the running result is not ok 33932 1726882884.83777: done checking to see if all hosts have failed 33932 1726882884.83778: getting the remaining hosts for this loop 33932 1726882884.83780: done getting the remaining hosts for this loop 33932 1726882884.83784: getting the next task for host 
managed_node1 33932 1726882884.83790: done getting next task for host managed_node1 33932 1726882884.83792: ^ task is: TASK: Set current_interfaces 33932 1726882884.83796: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882884.83800: getting variables 33932 1726882884.83801: in VariableManager get_vars() 33932 1726882884.83841: Calling all_inventory to load vars for managed_node1 33932 1726882884.83844: Calling groups_inventory to load vars for managed_node1 33932 1726882884.83847: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882884.83858: Calling all_plugins_play to load vars for managed_node1 33932 1726882884.83861: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882884.83872: Calling groups_plugins_play to load vars for managed_node1 33932 1726882884.84226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882884.84577: done with get_vars() 33932 1726882884.84588: done getting variables 33932 1726882884.84645: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:41:24 -0400 (0:00:00.544) 0:00:05.315 ****** 33932 1726882884.84794: entering _queue_task() for managed_node1/set_fact 33932 1726882884.85301: worker is 1 (out of 1 available) 33932 1726882884.85312: exiting _queue_task() for managed_node1/set_fact 33932 1726882884.85437: done queuing things up, now waiting for results queue to drain 33932 1726882884.85439: waiting for pending results... 
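The completed "Gather current interface info" task above ran `ls -1` with `chdir: /sys/class/net` and returned `stdout: "bonding_masters\neth0\nlo"`. A minimal sketch (hedged: `parse_interfaces` is a hypothetical helper name, not anything from this log or from Ansible) of how that raw stdout maps to the per-line interface list the next task consumes:

```python
# Sketch: turn the command module's stdout into an interface list,
# mirroring the "Gather current interface info" result shown above.
# parse_interfaces is a hypothetical name, not part of the log.
def parse_interfaces(stdout: str) -> list[str]:
    """Split `ls -1 /sys/class/net` output into one entry per line."""
    return [line for line in stdout.splitlines() if line]

# stdout exactly as reported in the task result above
raw = "bonding_masters\neth0\nlo"
print(parse_interfaces(raw))  # ['bonding_masters', 'eth0', 'lo']
```

Note that `bonding_masters` is a control file in `/sys/class/net`, not a real interface, which is why it shows up alongside `eth0` and `lo` in the listing.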
33932 1726882884.86077: running TaskExecutor() for managed_node1/TASK: Set current_interfaces 33932 1726882884.86156: in run() - task 0e448fcc-3ce9-615b-5c48-000000000200 33932 1726882884.86356: variable 'ansible_search_path' from source: unknown 33932 1726882884.86361: variable 'ansible_search_path' from source: unknown 33932 1726882884.86588: calling self._execute() 33932 1726882884.86658: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882884.86662: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882884.86673: variable 'omit' from source: magic vars 33932 1726882884.87203: variable 'ansible_distribution_major_version' from source: facts 33932 1726882884.87215: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882884.87221: variable 'omit' from source: magic vars 33932 1726882884.87261: variable 'omit' from source: magic vars 33932 1726882884.87631: variable '_current_interfaces' from source: set_fact 33932 1726882884.87693: variable 'omit' from source: magic vars 33932 1726882884.87731: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882884.88038: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882884.88192: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882884.88216: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882884.88232: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882884.88270: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882884.88399: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882884.88407: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882884.88513: Set connection var ansible_shell_executable to /bin/sh 33932 1726882884.88576: Set connection var ansible_timeout to 10 33932 1726882884.88617: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882884.88628: Set connection var ansible_pipelining to False 33932 1726882884.88732: Set connection var ansible_connection to ssh 33932 1726882884.88745: Set connection var ansible_shell_type to sh 33932 1726882884.88776: variable 'ansible_shell_executable' from source: unknown 33932 1726882884.88785: variable 'ansible_connection' from source: unknown 33932 1726882884.88792: variable 'ansible_module_compression' from source: unknown 33932 1726882884.88798: variable 'ansible_shell_type' from source: unknown 33932 1726882884.88804: variable 'ansible_shell_executable' from source: unknown 33932 1726882884.88811: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882884.88818: variable 'ansible_pipelining' from source: unknown 33932 1726882884.88824: variable 'ansible_timeout' from source: unknown 33932 1726882884.88831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882884.89202: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882884.89217: variable 'omit' from source: magic vars 33932 1726882884.89226: starting attempt loop 33932 1726882884.89233: running the handler 33932 1726882884.89248: handler run complete 33932 1726882884.89273: attempt loop complete, returning result 33932 1726882884.89280: _execute() done 33932 1726882884.89287: dumping result to json 33932 1726882884.89294: done dumping result, returning 33932 
1726882884.89314: done running TaskExecutor() for managed_node1/TASK: Set current_interfaces [0e448fcc-3ce9-615b-5c48-000000000200] 33932 1726882884.89358: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000200 ok: [managed_node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 33932 1726882884.89529: no more pending results, returning what we have 33932 1726882884.89532: results queue empty 33932 1726882884.89533: checking for any_errors_fatal 33932 1726882884.89542: done checking for any_errors_fatal 33932 1726882884.89543: checking for max_fail_percentage 33932 1726882884.89545: done checking for max_fail_percentage 33932 1726882884.89546: checking to see if all hosts have failed and the running result is not ok 33932 1726882884.89546: done checking to see if all hosts have failed 33932 1726882884.89547: getting the remaining hosts for this loop 33932 1726882884.89549: done getting the remaining hosts for this loop 33932 1726882884.89553: getting the next task for host managed_node1 33932 1726882884.89564: done getting next task for host managed_node1 33932 1726882884.89566: ^ task is: TASK: Show current_interfaces 33932 1726882884.89573: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882884.89577: getting variables 33932 1726882884.89578: in VariableManager get_vars() 33932 1726882884.89616: Calling all_inventory to load vars for managed_node1 33932 1726882884.89619: Calling groups_inventory to load vars for managed_node1 33932 1726882884.89621: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882884.89633: Calling all_plugins_play to load vars for managed_node1 33932 1726882884.89636: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882884.89640: Calling groups_plugins_play to load vars for managed_node1 33932 1726882884.89847: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000200 33932 1726882884.89851: WORKER PROCESS EXITING 33932 1726882884.89866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882884.90057: done with get_vars() 33932 1726882884.90078: done getting variables 33932 1726882884.90281: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:41:24 -0400 (0:00:00.055) 0:00:05.370 ****** 33932 1726882884.90313: entering _queue_task() for managed_node1/debug 33932 1726882884.90315: Creating lock for debug 33932 1726882884.90788: worker is 1 (out of 1 available) 33932 1726882884.90801: exiting _queue_task() for managed_node1/debug 33932 1726882884.90811: done queuing things up, now waiting for results queue to drain 33932 1726882884.90813: waiting for pending results... 
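The "Set current_interfaces" result above registers `current_interfaces` under `ansible_facts` for the host. A toy sketch (hedged: `set_fact` and `host_facts` here are illustrative names, not Ansible internals) of the merge behavior such a task implies:

```python
# Toy sketch of merging a set_fact-style result into per-host facts;
# these names are illustrative, not Ansible's actual internals.
def set_fact(host_facts: dict, new_facts: dict) -> dict:
    merged = dict(host_facts)
    merged.update(new_facts)  # later facts override earlier ones
    return merged

facts = set_fact({}, {"current_interfaces": ["bonding_masters", "eth0", "lo"]})
print(facts["current_interfaces"])  # ['bonding_masters', 'eth0', 'lo']
```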
33932 1726882884.91572: running TaskExecutor() for managed_node1/TASK: Show current_interfaces 33932 1726882884.91920: in run() - task 0e448fcc-3ce9-615b-5c48-000000000121 33932 1726882884.92162: variable 'ansible_search_path' from source: unknown 33932 1726882884.92173: variable 'ansible_search_path' from source: unknown 33932 1726882884.92227: calling self._execute() 33932 1726882884.92305: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882884.92315: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882884.92328: variable 'omit' from source: magic vars 33932 1726882884.93104: variable 'ansible_distribution_major_version' from source: facts 33932 1726882884.93186: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882884.93200: variable 'omit' from source: magic vars 33932 1726882884.93240: variable 'omit' from source: magic vars 33932 1726882884.93461: variable 'current_interfaces' from source: set_fact 33932 1726882884.93502: variable 'omit' from source: magic vars 33932 1726882884.93607: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882884.93703: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882884.93724: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882884.93745: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882884.93884: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882884.93913: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882884.93921: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882884.93928: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882884.94024: Set connection var ansible_shell_executable to /bin/sh 33932 1726882884.94090: Set connection var ansible_timeout to 10 33932 1726882884.94099: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882884.94161: Set connection var ansible_pipelining to False 33932 1726882884.94171: Set connection var ansible_connection to ssh 33932 1726882884.94178: Set connection var ansible_shell_type to sh 33932 1726882884.94206: variable 'ansible_shell_executable' from source: unknown 33932 1726882884.94214: variable 'ansible_connection' from source: unknown 33932 1726882884.94221: variable 'ansible_module_compression' from source: unknown 33932 1726882884.94275: variable 'ansible_shell_type' from source: unknown 33932 1726882884.94284: variable 'ansible_shell_executable' from source: unknown 33932 1726882884.94291: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882884.94298: variable 'ansible_pipelining' from source: unknown 33932 1726882884.94304: variable 'ansible_timeout' from source: unknown 33932 1726882884.94311: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882884.94436: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882884.94581: variable 'omit' from source: magic vars 33932 1726882884.94591: starting attempt loop 33932 1726882884.94597: running the handler 33932 1726882884.94814: handler run complete 33932 1726882884.94835: attempt loop complete, returning result 33932 1726882884.94842: _execute() done 33932 1726882884.94849: dumping result to json 33932 1726882884.94855: done dumping result, returning 33932 1726882884.94868: done 
running TaskExecutor() for managed_node1/TASK: Show current_interfaces [0e448fcc-3ce9-615b-5c48-000000000121] 33932 1726882884.94880: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000121 ok: [managed_node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 33932 1726882884.95011: no more pending results, returning what we have 33932 1726882884.95014: results queue empty 33932 1726882884.95015: checking for any_errors_fatal 33932 1726882884.95020: done checking for any_errors_fatal 33932 1726882884.95021: checking for max_fail_percentage 33932 1726882884.95022: done checking for max_fail_percentage 33932 1726882884.95023: checking to see if all hosts have failed and the running result is not ok 33932 1726882884.95024: done checking to see if all hosts have failed 33932 1726882884.95025: getting the remaining hosts for this loop 33932 1726882884.95027: done getting the remaining hosts for this loop 33932 1726882884.95030: getting the next task for host managed_node1 33932 1726882884.95037: done getting next task for host managed_node1 33932 1726882884.95041: ^ task is: TASK: Include the task 'manage_test_interface.yml' 33932 1726882884.95043: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882884.95046: getting variables 33932 1726882884.95048: in VariableManager get_vars() 33932 1726882884.95089: Calling all_inventory to load vars for managed_node1 33932 1726882884.95092: Calling groups_inventory to load vars for managed_node1 33932 1726882884.95094: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882884.95105: Calling all_plugins_play to load vars for managed_node1 33932 1726882884.95107: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882884.95110: Calling groups_plugins_play to load vars for managed_node1 33932 1726882884.95277: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882884.95481: done with get_vars() 33932 1726882884.95492: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:12 Friday 20 September 2024 21:41:24 -0400 (0:00:00.052) 0:00:05.423 ****** 33932 1726882884.95579: entering _queue_task() for managed_node1/include_tasks 33932 1726882884.95597: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000121 33932 1726882884.95606: WORKER PROCESS EXITING 33932 1726882884.96007: worker is 1 (out of 1 available) 33932 1726882884.96019: exiting _queue_task() for managed_node1/include_tasks 33932 1726882884.96030: done queuing things up, now waiting for results queue to drain 33932 1726882884.96032: waiting for pending results... 
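The "Show current_interfaces" debug task above emitted `MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo']`. A small sketch (hedged: `format_debug_msg` is a hypothetical name) reproducing how a `debug` message of the form `msg: "current_interfaces: {{ current_interfaces }}"` renders a list into that string:

```python
# Sketch: render a name/value pair the way the debug task's MSG
# appears above; format_debug_msg is a hypothetical name.
def format_debug_msg(name: str, value) -> str:
    return f"{name}: {value}"

msg = format_debug_msg("current_interfaces", ["bonding_masters", "eth0", "lo"])
print(msg)  # current_interfaces: ['bonding_masters', 'eth0', 'lo']
```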
33932 1726882884.96799: running TaskExecutor() for managed_node1/TASK: Include the task 'manage_test_interface.yml' 33932 1726882884.96999: in run() - task 0e448fcc-3ce9-615b-5c48-00000000000c 33932 1726882884.97121: variable 'ansible_search_path' from source: unknown 33932 1726882884.97170: calling self._execute() 33932 1726882884.97337: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882884.97470: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882884.97486: variable 'omit' from source: magic vars 33932 1726882884.98208: variable 'ansible_distribution_major_version' from source: facts 33932 1726882884.98340: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882884.98369: _execute() done 33932 1726882884.98378: dumping result to json 33932 1726882884.98386: done dumping result, returning 33932 1726882884.98395: done running TaskExecutor() for managed_node1/TASK: Include the task 'manage_test_interface.yml' [0e448fcc-3ce9-615b-5c48-00000000000c] 33932 1726882884.98405: sending task result for task 0e448fcc-3ce9-615b-5c48-00000000000c 33932 1726882884.98561: no more pending results, returning what we have 33932 1726882884.98568: in VariableManager get_vars() 33932 1726882884.98612: Calling all_inventory to load vars for managed_node1 33932 1726882884.98615: Calling groups_inventory to load vars for managed_node1 33932 1726882884.98617: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882884.98630: Calling all_plugins_play to load vars for managed_node1 33932 1726882884.98633: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882884.98636: Calling groups_plugins_play to load vars for managed_node1 33932 1726882884.98862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882884.99030: done with get_vars() 33932 1726882884.99037: variable 'ansible_search_path' 
from source: unknown 33932 1726882884.99049: we have included files to process 33932 1726882884.99050: generating all_blocks data 33932 1726882884.99051: done generating all_blocks data 33932 1726882884.99057: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 33932 1726882884.99058: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 33932 1726882884.99060: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 33932 1726882884.99750: done sending task result for task 0e448fcc-3ce9-615b-5c48-00000000000c 33932 1726882884.99753: WORKER PROCESS EXITING 33932 1726882885.00414: in VariableManager get_vars() 33932 1726882885.00436: done with get_vars() 33932 1726882885.00668: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 33932 1726882885.01892: done processing included file 33932 1726882885.01895: iterating over new_blocks loaded from include file 33932 1726882885.01896: in VariableManager get_vars() 33932 1726882885.01914: done with get_vars() 33932 1726882885.01916: filtering new block on tags 33932 1726882885.01948: done filtering new block on tags 33932 1726882885.01951: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node1 33932 1726882885.01955: extending task lists for all hosts with included blocks 33932 1726882885.06022: done extending task lists 33932 1726882885.06024: done processing included files 33932 1726882885.06025: results queue empty 33932 1726882885.06026: checking for any_errors_fatal 33932 1726882885.06029: done checking for any_errors_fatal 33932 1726882885.06030: checking for max_fail_percentage 
33932 1726882885.06031: done checking for max_fail_percentage 33932 1726882885.06032: checking to see if all hosts have failed and the running result is not ok 33932 1726882885.06033: done checking to see if all hosts have failed 33932 1726882885.06033: getting the remaining hosts for this loop 33932 1726882885.06035: done getting the remaining hosts for this loop 33932 1726882885.06038: getting the next task for host managed_node1 33932 1726882885.06042: done getting next task for host managed_node1 33932 1726882885.06044: ^ task is: TASK: Ensure state in ["present", "absent"] 33932 1726882885.06047: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882885.06049: getting variables 33932 1726882885.06050: in VariableManager get_vars() 33932 1726882885.06069: Calling all_inventory to load vars for managed_node1 33932 1726882885.06146: Calling groups_inventory to load vars for managed_node1 33932 1726882885.06149: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882885.06155: Calling all_plugins_play to load vars for managed_node1 33932 1726882885.06158: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882885.06160: Calling groups_plugins_play to load vars for managed_node1 33932 1726882885.06307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882885.06488: done with get_vars() 33932 1726882885.06498: done getting variables 33932 1726882885.06562: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 21:41:25 -0400 (0:00:00.110) 0:00:05.533 ****** 33932 1726882885.06588: entering _queue_task() for managed_node1/fail 33932 1726882885.06590: Creating lock for fail 33932 1726882885.07666: worker is 1 (out of 1 available) 33932 1726882885.07678: exiting _queue_task() for managed_node1/fail 33932 1726882885.07690: done queuing things up, now waiting for results queue to drain 33932 1726882885.07692: waiting for pending results... 
33932 1726882885.08597: running TaskExecutor() for managed_node1/TASK: Ensure state in ["present", "absent"] 33932 1726882885.08686: in run() - task 0e448fcc-3ce9-615b-5c48-00000000021b 33932 1726882885.08839: variable 'ansible_search_path' from source: unknown 33932 1726882885.08847: variable 'ansible_search_path' from source: unknown 33932 1726882885.08889: calling self._execute() 33932 1726882885.09115: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882885.09126: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882885.09137: variable 'omit' from source: magic vars 33932 1726882885.09852: variable 'ansible_distribution_major_version' from source: facts 33932 1726882885.09872: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882885.10234: variable 'state' from source: include params 33932 1726882885.10245: Evaluated conditional (state not in ["present", "absent"]): False 33932 1726882885.10252: when evaluation is False, skipping this task 33932 1726882885.10258: _execute() done 33932 1726882885.10266: dumping result to json 33932 1726882885.10273: done dumping result, returning 33932 1726882885.10283: done running TaskExecutor() for managed_node1/TASK: Ensure state in ["present", "absent"] [0e448fcc-3ce9-615b-5c48-00000000021b] 33932 1726882885.10294: sending task result for task 0e448fcc-3ce9-615b-5c48-00000000021b skipping: [managed_node1] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 33932 1726882885.10430: no more pending results, returning what we have 33932 1726882885.10434: results queue empty 33932 1726882885.10435: checking for any_errors_fatal 33932 1726882885.10437: done checking for any_errors_fatal 33932 1726882885.10438: checking for max_fail_percentage 33932 1726882885.10440: done checking for max_fail_percentage 33932 1726882885.10440: checking to see if all hosts 
have failed and the running result is not ok 33932 1726882885.10441: done checking to see if all hosts have failed 33932 1726882885.10442: getting the remaining hosts for this loop 33932 1726882885.10444: done getting the remaining hosts for this loop 33932 1726882885.10447: getting the next task for host managed_node1 33932 1726882885.10453: done getting next task for host managed_node1 33932 1726882885.10456: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 33932 1726882885.10459: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882885.10462: getting variables 33932 1726882885.10466: in VariableManager get_vars() 33932 1726882885.10532: Calling all_inventory to load vars for managed_node1 33932 1726882885.10535: Calling groups_inventory to load vars for managed_node1 33932 1726882885.10538: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882885.10551: Calling all_plugins_play to load vars for managed_node1 33932 1726882885.10554: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882885.10557: Calling groups_plugins_play to load vars for managed_node1 33932 1726882885.10735: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882885.10940: done with get_vars() 33932 1726882885.10950: done getting variables 33932 1726882885.11023: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 21:41:25 -0400 (0:00:00.044) 0:00:05.578 ****** 33932 1726882885.11054: entering _queue_task() for managed_node1/fail 33932 1726882885.11072: done sending task result for task 0e448fcc-3ce9-615b-5c48-00000000021b 33932 1726882885.11082: WORKER PROCESS EXITING 33932 1726882885.12003: worker is 1 (out of 1 available) 33932 1726882885.12016: exiting _queue_task() for managed_node1/fail 33932 1726882885.12027: done queuing things up, now waiting for results queue to drain 33932 1726882885.12029: waiting for pending results... 
33932 1726882885.12825: running TaskExecutor() for managed_node1/TASK: Ensure type in ["dummy", "tap", "veth"] 33932 1726882885.12934: in run() - task 0e448fcc-3ce9-615b-5c48-00000000021c 33932 1726882885.12960: variable 'ansible_search_path' from source: unknown 33932 1726882885.12975: variable 'ansible_search_path' from source: unknown 33932 1726882885.13018: calling self._execute() 33932 1726882885.13113: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882885.13125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882885.13139: variable 'omit' from source: magic vars 33932 1726882885.13531: variable 'ansible_distribution_major_version' from source: facts 33932 1726882885.13556: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882885.13718: variable 'type' from source: play vars 33932 1726882885.13731: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 33932 1726882885.13738: when evaluation is False, skipping this task 33932 1726882885.13744: _execute() done 33932 1726882885.13753: dumping result to json 33932 1726882885.13766: done dumping result, returning 33932 1726882885.13778: done running TaskExecutor() for managed_node1/TASK: Ensure type in ["dummy", "tap", "veth"] [0e448fcc-3ce9-615b-5c48-00000000021c] 33932 1726882885.13789: sending task result for task 0e448fcc-3ce9-615b-5c48-00000000021c skipping: [managed_node1] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 33932 1726882885.13939: no more pending results, returning what we have 33932 1726882885.13943: results queue empty 33932 1726882885.13944: checking for any_errors_fatal 33932 1726882885.13949: done checking for any_errors_fatal 33932 1726882885.13949: checking for max_fail_percentage 33932 1726882885.13951: done checking for max_fail_percentage 33932 1726882885.13951: checking to see if all 
hosts have failed and the running result is not ok 33932 1726882885.13952: done checking to see if all hosts have failed 33932 1726882885.13953: getting the remaining hosts for this loop 33932 1726882885.13954: done getting the remaining hosts for this loop 33932 1726882885.13958: getting the next task for host managed_node1 33932 1726882885.13966: done getting next task for host managed_node1 33932 1726882885.13969: ^ task is: TASK: Include the task 'show_interfaces.yml' 33932 1726882885.13972: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882885.13975: getting variables 33932 1726882885.13977: in VariableManager get_vars() 33932 1726882885.14015: Calling all_inventory to load vars for managed_node1 33932 1726882885.14018: Calling groups_inventory to load vars for managed_node1 33932 1726882885.14020: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882885.14035: Calling all_plugins_play to load vars for managed_node1 33932 1726882885.14038: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882885.14042: Calling groups_plugins_play to load vars for managed_node1 33932 1726882885.14185: done sending task result for task 0e448fcc-3ce9-615b-5c48-00000000021c 33932 1726882885.14189: WORKER PROCESS EXITING 33932 1726882885.14214: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882885.14462: done with get_vars() 33932 1726882885.14477: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 21:41:25 -0400 (0:00:00.035) 0:00:05.613 ****** 33932 1726882885.14578: entering _queue_task() for managed_node1/include_tasks 33932 1726882885.14971: worker is 1 (out of 1 available) 33932 1726882885.14984: exiting _queue_task() for managed_node1/include_tasks 33932 1726882885.14995: done queuing things up, now waiting for results queue to drain 33932 1726882885.14996: waiting for pending results... 
33932 1726882885.15606: running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' 33932 1726882885.15715: in run() - task 0e448fcc-3ce9-615b-5c48-00000000021d 33932 1726882885.15738: variable 'ansible_search_path' from source: unknown 33932 1726882885.15748: variable 'ansible_search_path' from source: unknown 33932 1726882885.15819: calling self._execute() 33932 1726882885.15918: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882885.15930: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882885.15943: variable 'omit' from source: magic vars 33932 1726882885.16536: variable 'ansible_distribution_major_version' from source: facts 33932 1726882885.16553: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882885.16627: _execute() done 33932 1726882885.16638: dumping result to json 33932 1726882885.16646: done dumping result, returning 33932 1726882885.16656: done running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' [0e448fcc-3ce9-615b-5c48-00000000021d] 33932 1726882885.16671: sending task result for task 0e448fcc-3ce9-615b-5c48-00000000021d 33932 1726882885.16821: no more pending results, returning what we have 33932 1726882885.16827: in VariableManager get_vars() 33932 1726882885.16878: Calling all_inventory to load vars for managed_node1 33932 1726882885.16882: Calling groups_inventory to load vars for managed_node1 33932 1726882885.16884: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882885.16898: Calling all_plugins_play to load vars for managed_node1 33932 1726882885.16901: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882885.16904: Calling groups_plugins_play to load vars for managed_node1 33932 1726882885.17092: done sending task result for task 0e448fcc-3ce9-615b-5c48-00000000021d 33932 1726882885.17096: WORKER PROCESS EXITING 33932 1726882885.17102: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882885.17296: done with get_vars() 33932 1726882885.17304: variable 'ansible_search_path' from source: unknown 33932 1726882885.17305: variable 'ansible_search_path' from source: unknown 33932 1726882885.17339: we have included files to process 33932 1726882885.17340: generating all_blocks data 33932 1726882885.17342: done generating all_blocks data 33932 1726882885.17728: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 33932 1726882885.17730: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 33932 1726882885.17733: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 33932 1726882885.17830: in VariableManager get_vars() 33932 1726882885.17852: done with get_vars() 33932 1726882885.17957: done processing included file 33932 1726882885.17959: iterating over new_blocks loaded from include file 33932 1726882885.17960: in VariableManager get_vars() 33932 1726882885.17980: done with get_vars() 33932 1726882885.17981: filtering new block on tags 33932 1726882885.17998: done filtering new block on tags 33932 1726882885.18000: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node1 33932 1726882885.18004: extending task lists for all hosts with included blocks 33932 1726882885.18392: done extending task lists 33932 1726882885.18393: done processing included files 33932 1726882885.18394: results queue empty 33932 1726882885.18395: checking for any_errors_fatal 33932 1726882885.18398: done checking for any_errors_fatal 33932 1726882885.18399: checking for 
max_fail_percentage 33932 1726882885.18400: done checking for max_fail_percentage 33932 1726882885.18401: checking to see if all hosts have failed and the running result is not ok 33932 1726882885.18401: done checking to see if all hosts have failed 33932 1726882885.18402: getting the remaining hosts for this loop 33932 1726882885.18403: done getting the remaining hosts for this loop 33932 1726882885.18406: getting the next task for host managed_node1 33932 1726882885.18410: done getting next task for host managed_node1 33932 1726882885.18412: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 33932 1726882885.18414: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882885.18417: getting variables 33932 1726882885.18417: in VariableManager get_vars() 33932 1726882885.18429: Calling all_inventory to load vars for managed_node1 33932 1726882885.18431: Calling groups_inventory to load vars for managed_node1 33932 1726882885.18433: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882885.18438: Calling all_plugins_play to load vars for managed_node1 33932 1726882885.18440: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882885.18443: Calling groups_plugins_play to load vars for managed_node1 33932 1726882885.18628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882885.18834: done with get_vars() 33932 1726882885.18843: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:41:25 -0400 (0:00:00.043) 0:00:05.656 ****** 33932 1726882885.18913: entering _queue_task() for managed_node1/include_tasks 33932 1726882885.19130: worker is 1 (out of 1 available) 33932 1726882885.19141: exiting _queue_task() for managed_node1/include_tasks 33932 1726882885.19152: done queuing things up, now waiting for results queue to drain 33932 1726882885.19154: waiting for pending results... 
33932 1726882885.20170: running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' 33932 1726882885.20321: in run() - task 0e448fcc-3ce9-615b-5c48-000000000314 33932 1726882885.20360: variable 'ansible_search_path' from source: unknown 33932 1726882885.20373: variable 'ansible_search_path' from source: unknown 33932 1726882885.20416: calling self._execute() 33932 1726882885.20562: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882885.20583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882885.20597: variable 'omit' from source: magic vars 33932 1726882885.21022: variable 'ansible_distribution_major_version' from source: facts 33932 1726882885.21041: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882885.21052: _execute() done 33932 1726882885.21060: dumping result to json 33932 1726882885.21074: done dumping result, returning 33932 1726882885.21091: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' [0e448fcc-3ce9-615b-5c48-000000000314] 33932 1726882885.21103: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000314 33932 1726882885.21226: no more pending results, returning what we have 33932 1726882885.21231: in VariableManager get_vars() 33932 1726882885.21276: Calling all_inventory to load vars for managed_node1 33932 1726882885.21279: Calling groups_inventory to load vars for managed_node1 33932 1726882885.21281: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882885.21294: Calling all_plugins_play to load vars for managed_node1 33932 1726882885.21296: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882885.21299: Calling groups_plugins_play to load vars for managed_node1 33932 1726882885.21456: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 
1726882885.21647: done with get_vars() 33932 1726882885.21654: variable 'ansible_search_path' from source: unknown 33932 1726882885.21655: variable 'ansible_search_path' from source: unknown 33932 1726882885.21726: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000314 33932 1726882885.21729: WORKER PROCESS EXITING 33932 1726882885.21773: we have included files to process 33932 1726882885.21791: generating all_blocks data 33932 1726882885.21792: done generating all_blocks data 33932 1726882885.21829: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 33932 1726882885.21831: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 33932 1726882885.21833: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 33932 1726882885.22197: done processing included file 33932 1726882885.22199: iterating over new_blocks loaded from include file 33932 1726882885.22201: in VariableManager get_vars() 33932 1726882885.22500: done with get_vars() 33932 1726882885.22503: filtering new block on tags 33932 1726882885.22520: done filtering new block on tags 33932 1726882885.22522: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node1 33932 1726882885.22526: extending task lists for all hosts with included blocks 33932 1726882885.22689: done extending task lists 33932 1726882885.22690: done processing included files 33932 1726882885.22691: results queue empty 33932 1726882885.22692: checking for any_errors_fatal 33932 1726882885.22694: done checking for any_errors_fatal 33932 1726882885.22695: checking for max_fail_percentage 33932 
1726882885.22696: done checking for max_fail_percentage 33932 1726882885.22697: checking to see if all hosts have failed and the running result is not ok 33932 1726882885.22698: done checking to see if all hosts have failed 33932 1726882885.22699: getting the remaining hosts for this loop 33932 1726882885.22700: done getting the remaining hosts for this loop 33932 1726882885.22702: getting the next task for host managed_node1 33932 1726882885.22706: done getting next task for host managed_node1 33932 1726882885.22708: ^ task is: TASK: Gather current interface info 33932 1726882885.22712: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882885.22714: getting variables 33932 1726882885.22715: in VariableManager get_vars() 33932 1726882885.22727: Calling all_inventory to load vars for managed_node1 33932 1726882885.22729: Calling groups_inventory to load vars for managed_node1 33932 1726882885.22731: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882885.22735: Calling all_plugins_play to load vars for managed_node1 33932 1726882885.22737: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882885.22740: Calling groups_plugins_play to load vars for managed_node1 33932 1726882885.22889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882885.23100: done with get_vars() 33932 1726882885.23108: done getting variables 33932 1726882885.23145: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:41:25 -0400 (0:00:00.042) 0:00:05.699 ****** 33932 1726882885.23167: entering _queue_task() for managed_node1/command 33932 1726882885.23360: worker is 1 (out of 1 available) 33932 1726882885.23376: exiting _queue_task() for managed_node1/command 33932 1726882885.23387: done queuing things up, now waiting for results queue to drain 33932 1726882885.23388: waiting for pending results... 
33932 1726882885.23545: running TaskExecutor() for managed_node1/TASK: Gather current interface info 33932 1726882885.23622: in run() - task 0e448fcc-3ce9-615b-5c48-00000000034b 33932 1726882885.23634: variable 'ansible_search_path' from source: unknown 33932 1726882885.23638: variable 'ansible_search_path' from source: unknown 33932 1726882885.23674: calling self._execute() 33932 1726882885.23740: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882885.23743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882885.23753: variable 'omit' from source: magic vars 33932 1726882885.24054: variable 'ansible_distribution_major_version' from source: facts 33932 1726882885.24073: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882885.24085: variable 'omit' from source: magic vars 33932 1726882885.24137: variable 'omit' from source: magic vars 33932 1726882885.24177: variable 'omit' from source: magic vars 33932 1726882885.24223: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882885.24262: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882885.24289: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882885.24317: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882885.24333: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882885.24366: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882885.24378: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882885.24386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 
1726882885.24492: Set connection var ansible_shell_executable to /bin/sh 33932 1726882885.24505: Set connection var ansible_timeout to 10 33932 1726882885.24514: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882885.24522: Set connection var ansible_pipelining to False 33932 1726882885.24528: Set connection var ansible_connection to ssh 33932 1726882885.24533: Set connection var ansible_shell_type to sh 33932 1726882885.24559: variable 'ansible_shell_executable' from source: unknown 33932 1726882885.24569: variable 'ansible_connection' from source: unknown 33932 1726882885.24579: variable 'ansible_module_compression' from source: unknown 33932 1726882885.24586: variable 'ansible_shell_type' from source: unknown 33932 1726882885.24594: variable 'ansible_shell_executable' from source: unknown 33932 1726882885.24600: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882885.24607: variable 'ansible_pipelining' from source: unknown 33932 1726882885.24612: variable 'ansible_timeout' from source: unknown 33932 1726882885.24619: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882885.24746: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882885.24761: variable 'omit' from source: magic vars 33932 1726882885.24774: starting attempt loop 33932 1726882885.24781: running the handler 33932 1726882885.24804: _low_level_execute_command(): starting 33932 1726882885.24818: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 33932 1726882885.26057: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882885.26129: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 33932 1726882885.26145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882885.26168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882885.26210: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882885.26223: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882885.26237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882885.26256: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882885.26270: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882885.26283: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882885.26295: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882885.26309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882885.26325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882885.26338: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882885.26362: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882885.26380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882885.26456: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882885.26483: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882885.26501: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882885.26628: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 33932 1726882885.28302: stdout chunk (state=3): >>>/root <<< 33932 1726882885.28476: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882885.28483: stdout chunk (state=3): >>><<< 33932 1726882885.28491: stderr chunk (state=3): >>><<< 33932 1726882885.28514: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882885.28527: _low_level_execute_command(): starting 33932 1726882885.28532: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882885.2851293-34250-23740187554073 `" && echo ansible-tmp-1726882885.2851293-34250-23740187554073="` echo /root/.ansible/tmp/ansible-tmp-1726882885.2851293-34250-23740187554073 `" ) && sleep 0' 33932 1726882885.29163: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882885.29176: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882885.29192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882885.29201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882885.29251: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882885.29255: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882885.29270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882885.29285: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882885.29304: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882885.29307: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882885.29309: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882885.29311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882885.29337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882885.29632: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882885.29636: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882885.29638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882885.29640: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882885.29642: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882885.29643: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 33932 1726882885.29645: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882885.31480: stdout chunk (state=3): >>>ansible-tmp-1726882885.2851293-34250-23740187554073=/root/.ansible/tmp/ansible-tmp-1726882885.2851293-34250-23740187554073 <<< 33932 1726882885.31603: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882885.31628: stderr chunk (state=3): >>><<< 33932 1726882885.31631: stdout chunk (state=3): >>><<< 33932 1726882885.31644: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882885.2851293-34250-23740187554073=/root/.ansible/tmp/ansible-tmp-1726882885.2851293-34250-23740187554073 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882885.31695: variable 'ansible_module_compression' from source: unknown 33932 1726882885.31729: ANSIBALLZ: 
using cached module: /root/.ansible/tmp/ansible-local-33932njcnmxb6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 33932 1726882885.31756: variable 'ansible_facts' from source: unknown 33932 1726882885.31830: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882885.2851293-34250-23740187554073/AnsiballZ_command.py 33932 1726882885.32157: Sending initial data 33932 1726882885.32161: Sent initial data (155 bytes) 33932 1726882885.32884: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882885.32887: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882885.32897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882885.32910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882885.33043: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882885.33099: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882885.33102: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882885.33128: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882885.33504: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 33932 1726882885.34947: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 33932 1726882885.35034: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 33932 1726882885.35123: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-33932njcnmxb6/tmp6bsa58m1 /root/.ansible/tmp/ansible-tmp-1726882885.2851293-34250-23740187554073/AnsiballZ_command.py <<< 33932 1726882885.35211: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 33932 1726882885.36282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882885.36342: stderr chunk (state=3): >>><<< 33932 1726882885.36345: stdout chunk (state=3): >>><<< 33932 1726882885.36357: done transferring module to remote 33932 1726882885.36369: _low_level_execute_command(): starting 33932 1726882885.36377: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882885.2851293-34250-23740187554073/ /root/.ansible/tmp/ansible-tmp-1726882885.2851293-34250-23740187554073/AnsiballZ_command.py && sleep 0' 33932 1726882885.37014: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882885.37020: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882885.37046: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882885.37053: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882885.37061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882885.37086: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882885.37089: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882885.37093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882885.37144: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882885.37147: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882885.37248: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882885.39005: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882885.39083: stderr chunk (state=3): >>><<< 33932 1726882885.39090: stdout chunk (state=3): >>><<< 33932 1726882885.39124: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882885.39127: _low_level_execute_command(): starting 33932 1726882885.39129: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882885.2851293-34250-23740187554073/AnsiballZ_command.py && sleep 0' 33932 1726882885.39843: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882885.39858: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882885.39877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882885.39886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882885.39967: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882885.39973: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882885.39976: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration <<< 33932 1726882885.39978: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882885.39980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 33932 1726882885.39999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882885.40051: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882885.40055: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882885.40066: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882885.40190: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882885.54241: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:41:25.537757", "end": "2024-09-20 21:41:25.541089", "delta": "0:00:00.003332", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 33932 1726882885.55535: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 33932 1726882885.55539: stdout chunk (state=3): >>><<< 33932 1726882885.55542: stderr chunk (state=3): >>><<< 33932 1726882885.55707: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:41:25.537757", "end": "2024-09-20 21:41:25.541089", "delta": "0:00:00.003332", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
33932 1726882885.55711: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882885.2851293-34250-23740187554073/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 33932 1726882885.55769: _low_level_execute_command(): starting 33932 1726882885.55772: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882885.2851293-34250-23740187554073/ > /dev/null 2>&1 && sleep 0' 33932 1726882885.57940: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882885.57954: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882885.57980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882885.58000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882885.58043: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882885.58055: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882885.58073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882885.58095: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882885.58106: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882885.58115: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882885.58126: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882885.58138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882885.58152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882885.58162: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882885.58177: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882885.58192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882885.58262: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882885.58290: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882885.58313: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882885.58440: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882885.60339: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882885.60342: stdout chunk (state=3): >>><<< 33932 1726882885.60344: stderr chunk (state=3): >>><<< 33932 1726882885.60570: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882885.60573: handler run complete 33932 1726882885.60576: Evaluated conditional (False): False 33932 1726882885.60578: attempt loop complete, returning result 33932 1726882885.60580: _execute() done 33932 1726882885.60582: dumping result to json 33932 1726882885.60584: done dumping result, returning 33932 1726882885.60586: done running TaskExecutor() for managed_node1/TASK: Gather current interface info [0e448fcc-3ce9-615b-5c48-00000000034b] 33932 1726882885.60588: sending task result for task 0e448fcc-3ce9-615b-5c48-00000000034b 33932 1726882885.60658: done sending task result for task 0e448fcc-3ce9-615b-5c48-00000000034b 33932 1726882885.60661: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003332", "end": "2024-09-20 21:41:25.541089", "rc": 0, "start": "2024-09-20 21:41:25.537757" } STDOUT: bonding_masters eth0 lo 33932 1726882885.60743: no more pending results, returning what we have 33932 1726882885.60746: results queue empty 33932 1726882885.60747: checking for any_errors_fatal 33932 1726882885.60749: done checking for any_errors_fatal 33932 1726882885.60750: checking for max_fail_percentage 33932 1726882885.60752: done checking for max_fail_percentage 33932 1726882885.60753: checking to see if all hosts have failed 
and the running result is not ok 33932 1726882885.60753: done checking to see if all hosts have failed 33932 1726882885.60754: getting the remaining hosts for this loop 33932 1726882885.60756: done getting the remaining hosts for this loop 33932 1726882885.60760: getting the next task for host managed_node1 33932 1726882885.60769: done getting next task for host managed_node1 33932 1726882885.60772: ^ task is: TASK: Set current_interfaces 33932 1726882885.60777: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882885.60781: getting variables 33932 1726882885.60783: in VariableManager get_vars() 33932 1726882885.60824: Calling all_inventory to load vars for managed_node1 33932 1726882885.60827: Calling groups_inventory to load vars for managed_node1 33932 1726882885.60829: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882885.60841: Calling all_plugins_play to load vars for managed_node1 33932 1726882885.60845: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882885.60849: Calling groups_plugins_play to load vars for managed_node1 33932 1726882885.61179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882885.61473: done with get_vars() 33932 1726882885.61485: done getting variables 33932 1726882885.62984: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:41:25 -0400 (0:00:00.398) 0:00:06.097 ****** 33932 1726882885.63022: entering _queue_task() for managed_node1/set_fact 33932 1726882885.63262: worker is 1 (out of 1 available) 33932 1726882885.63277: exiting _queue_task() for managed_node1/set_fact 33932 1726882885.63289: done queuing things up, now waiting for results queue to drain 33932 1726882885.63291: waiting for pending results... 
33932 1726882885.64133: running TaskExecutor() for managed_node1/TASK: Set current_interfaces 33932 1726882885.64288: in run() - task 0e448fcc-3ce9-615b-5c48-00000000034c 33932 1726882885.64888: variable 'ansible_search_path' from source: unknown 33932 1726882885.64895: variable 'ansible_search_path' from source: unknown 33932 1726882885.64935: calling self._execute() 33932 1726882885.65025: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882885.65036: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882885.65050: variable 'omit' from source: magic vars 33932 1726882885.65415: variable 'ansible_distribution_major_version' from source: facts 33932 1726882885.65488: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882885.65681: variable 'omit' from source: magic vars 33932 1726882885.65736: variable 'omit' from source: magic vars 33932 1726882885.65846: variable '_current_interfaces' from source: set_fact 33932 1726882885.66517: variable 'omit' from source: magic vars 33932 1726882885.66559: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882885.66602: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882885.66627: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882885.66648: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882885.66662: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882885.66703: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882885.66712: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882885.66721: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882885.66828: Set connection var ansible_shell_executable to /bin/sh 33932 1726882885.66842: Set connection var ansible_timeout to 10 33932 1726882885.66851: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882885.66858: Set connection var ansible_pipelining to False 33932 1726882885.66866: Set connection var ansible_connection to ssh 33932 1726882885.66875: Set connection var ansible_shell_type to sh 33932 1726882885.66899: variable 'ansible_shell_executable' from source: unknown 33932 1726882885.66905: variable 'ansible_connection' from source: unknown 33932 1726882885.66911: variable 'ansible_module_compression' from source: unknown 33932 1726882885.66916: variable 'ansible_shell_type' from source: unknown 33932 1726882885.66921: variable 'ansible_shell_executable' from source: unknown 33932 1726882885.66926: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882885.66931: variable 'ansible_pipelining' from source: unknown 33932 1726882885.66936: variable 'ansible_timeout' from source: unknown 33932 1726882885.66943: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882885.67091: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882885.67108: variable 'omit' from source: magic vars 33932 1726882885.67118: starting attempt loop 33932 1726882885.67124: running the handler 33932 1726882885.67139: handler run complete 33932 1726882885.67153: attempt loop complete, returning result 33932 1726882885.67160: _execute() done 33932 1726882885.67171: dumping result to json 33932 1726882885.67179: done dumping result, returning 33932 
1726882885.67190: done running TaskExecutor() for managed_node1/TASK: Set current_interfaces [0e448fcc-3ce9-615b-5c48-00000000034c] 33932 1726882885.67199: sending task result for task 0e448fcc-3ce9-615b-5c48-00000000034c ok: [managed_node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 33932 1726882885.67425: no more pending results, returning what we have 33932 1726882885.67428: results queue empty 33932 1726882885.67429: checking for any_errors_fatal 33932 1726882885.67436: done checking for any_errors_fatal 33932 1726882885.67436: checking for max_fail_percentage 33932 1726882885.67438: done checking for max_fail_percentage 33932 1726882885.67438: checking to see if all hosts have failed and the running result is not ok 33932 1726882885.67439: done checking to see if all hosts have failed 33932 1726882885.67440: getting the remaining hosts for this loop 33932 1726882885.67441: done getting the remaining hosts for this loop 33932 1726882885.67445: getting the next task for host managed_node1 33932 1726882885.67451: done getting next task for host managed_node1 33932 1726882885.67454: ^ task is: TASK: Show current_interfaces 33932 1726882885.67458: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33932 1726882885.67461: getting variables 33932 1726882885.67462: in VariableManager get_vars() 33932 1726882885.67509: Calling all_inventory to load vars for managed_node1 33932 1726882885.67512: Calling groups_inventory to load vars for managed_node1 33932 1726882885.67515: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882885.67527: Calling all_plugins_play to load vars for managed_node1 33932 1726882885.67531: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882885.67535: Calling groups_plugins_play to load vars for managed_node1 33932 1726882885.67710: done sending task result for task 0e448fcc-3ce9-615b-5c48-00000000034c 33932 1726882885.67714: WORKER PROCESS EXITING 33932 1726882885.67727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882885.67929: done with get_vars() 33932 1726882885.67940: done getting variables 33932 1726882885.67993: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:41:25 -0400 (0:00:00.049) 0:00:06.147 ****** 33932 1726882885.68021: entering _queue_task() for managed_node1/debug 33932 1726882885.68552: worker is 1 (out of 1 available) 33932 1726882885.68567: exiting _queue_task() for managed_node1/debug 33932 1726882885.68656: done queuing things up, now waiting for results queue to drain 33932 1726882885.68658: waiting for pending 
results... 33932 1726882885.69311: running TaskExecutor() for managed_node1/TASK: Show current_interfaces 33932 1726882885.69573: in run() - task 0e448fcc-3ce9-615b-5c48-000000000315 33932 1726882885.69690: variable 'ansible_search_path' from source: unknown 33932 1726882885.69700: variable 'ansible_search_path' from source: unknown 33932 1726882885.69740: calling self._execute() 33932 1726882885.69828: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882885.70247: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882885.70260: variable 'omit' from source: magic vars 33932 1726882885.71035: variable 'ansible_distribution_major_version' from source: facts 33932 1726882885.71054: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882885.71072: variable 'omit' from source: magic vars 33932 1726882885.71120: variable 'omit' from source: magic vars 33932 1726882885.71224: variable 'current_interfaces' from source: set_fact 33932 1726882885.71481: variable 'omit' from source: magic vars 33932 1726882885.71526: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882885.71573: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882885.71605: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882885.71627: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882885.71645: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882885.71698: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882885.71707: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882885.71779: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882885.71928: Set connection var ansible_shell_executable to /bin/sh 33932 1726882885.72030: Set connection var ansible_timeout to 10 33932 1726882885.72041: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882885.72050: Set connection var ansible_pipelining to False 33932 1726882885.72057: Set connection var ansible_connection to ssh 33932 1726882885.72067: Set connection var ansible_shell_type to sh 33932 1726882885.72100: variable 'ansible_shell_executable' from source: unknown 33932 1726882885.72110: variable 'ansible_connection' from source: unknown 33932 1726882885.72118: variable 'ansible_module_compression' from source: unknown 33932 1726882885.72130: variable 'ansible_shell_type' from source: unknown 33932 1726882885.72242: variable 'ansible_shell_executable' from source: unknown 33932 1726882885.72250: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882885.72259: variable 'ansible_pipelining' from source: unknown 33932 1726882885.72272: variable 'ansible_timeout' from source: unknown 33932 1726882885.72283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882885.72519: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882885.72536: variable 'omit' from source: magic vars 33932 1726882885.72575: starting attempt loop 33932 1726882885.72583: running the handler 33932 1726882885.72724: handler run complete 33932 1726882885.72935: attempt loop complete, returning result 33932 1726882885.72943: _execute() done 33932 1726882885.72951: dumping result to json 33932 1726882885.72959: done dumping result, returning 33932 1726882885.72977: done 
running TaskExecutor() for managed_node1/TASK: Show current_interfaces [0e448fcc-3ce9-615b-5c48-000000000315] 33932 1726882885.72988: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000315 ok: [managed_node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 33932 1726882885.73152: no more pending results, returning what we have 33932 1726882885.73155: results queue empty 33932 1726882885.73156: checking for any_errors_fatal 33932 1726882885.73161: done checking for any_errors_fatal 33932 1726882885.73162: checking for max_fail_percentage 33932 1726882885.73165: done checking for max_fail_percentage 33932 1726882885.73166: checking to see if all hosts have failed and the running result is not ok 33932 1726882885.73167: done checking to see if all hosts have failed 33932 1726882885.73167: getting the remaining hosts for this loop 33932 1726882885.73172: done getting the remaining hosts for this loop 33932 1726882885.73176: getting the next task for host managed_node1 33932 1726882885.73183: done getting next task for host managed_node1 33932 1726882885.73187: ^ task is: TASK: Install iproute 33932 1726882885.73190: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882885.73194: getting variables 33932 1726882885.73196: in VariableManager get_vars() 33932 1726882885.73239: Calling all_inventory to load vars for managed_node1 33932 1726882885.73243: Calling groups_inventory to load vars for managed_node1 33932 1726882885.73245: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882885.73260: Calling all_plugins_play to load vars for managed_node1 33932 1726882885.73267: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882885.73273: Calling groups_plugins_play to load vars for managed_node1 33932 1726882885.73505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882885.73704: done with get_vars() 33932 1726882885.73715: done getting variables 33932 1726882885.74524: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 33932 1726882885.74545: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000315 33932 1726882885.74548: WORKER PROCESS EXITING TASK [Install iproute] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 21:41:25 -0400 (0:00:00.065) 0:00:06.213 ****** 33932 1726882885.74560: entering _queue_task() for managed_node1/package 33932 1726882885.74822: worker is 1 (out of 1 available) 33932 1726882885.74833: exiting _queue_task() for managed_node1/package 33932 1726882885.74845: done queuing things up, now waiting for results queue to drain 33932 1726882885.74847: waiting for pending results... 
33932 1726882885.75103: running TaskExecutor() for managed_node1/TASK: Install iproute 33932 1726882885.75197: in run() - task 0e448fcc-3ce9-615b-5c48-00000000021e 33932 1726882885.75215: variable 'ansible_search_path' from source: unknown 33932 1726882885.75222: variable 'ansible_search_path' from source: unknown 33932 1726882885.75262: calling self._execute() 33932 1726882885.75350: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882885.75361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882885.75378: variable 'omit' from source: magic vars 33932 1726882885.75744: variable 'ansible_distribution_major_version' from source: facts 33932 1726882885.75765: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882885.75780: variable 'omit' from source: magic vars 33932 1726882885.75817: variable 'omit' from source: magic vars 33932 1726882885.76013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 33932 1726882885.78795: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 33932 1726882885.78879: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 33932 1726882885.78920: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 33932 1726882885.78959: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 33932 1726882885.79098: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 33932 1726882885.79294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882885.79329: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882885.79361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882885.79418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882885.79491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882885.79602: variable '__network_is_ostree' from source: set_fact 33932 1726882885.79728: variable 'omit' from source: magic vars 33932 1726882885.79762: variable 'omit' from source: magic vars 33932 1726882885.79856: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882885.79892: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882885.79956: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882885.79985: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882885.80057: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882885.80095: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882885.80160: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882885.80174: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 33932 1726882885.80382: Set connection var ansible_shell_executable to /bin/sh 33932 1726882885.80397: Set connection var ansible_timeout to 10 33932 1726882885.80413: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882885.80452: Set connection var ansible_pipelining to False 33932 1726882885.80460: Set connection var ansible_connection to ssh 33932 1726882885.80472: Set connection var ansible_shell_type to sh 33932 1726882885.80505: variable 'ansible_shell_executable' from source: unknown 33932 1726882885.80520: variable 'ansible_connection' from source: unknown 33932 1726882885.80529: variable 'ansible_module_compression' from source: unknown 33932 1726882885.80545: variable 'ansible_shell_type' from source: unknown 33932 1726882885.80553: variable 'ansible_shell_executable' from source: unknown 33932 1726882885.80559: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882885.80572: variable 'ansible_pipelining' from source: unknown 33932 1726882885.80580: variable 'ansible_timeout' from source: unknown 33932 1726882885.80592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882885.80705: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882885.80721: variable 'omit' from source: magic vars 33932 1726882885.80731: starting attempt loop 33932 1726882885.80739: running the handler 33932 1726882885.80750: variable 'ansible_facts' from source: unknown 33932 1726882885.80757: variable 'ansible_facts' from source: unknown 33932 1726882885.80797: _low_level_execute_command(): starting 33932 1726882885.80813: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 33932 
1726882885.81538: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882885.81551: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882885.81566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882885.81588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882885.81629: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882885.81640: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882885.81653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882885.81677: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882885.81692: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882885.81702: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882885.81713: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882885.81726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882885.81739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882885.81748: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882885.81756: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882885.81770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882885.81842: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882885.81878: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882885.81895: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882885.82023: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882885.83698: stdout chunk (state=3): >>>/root <<< 33932 1726882885.83861: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882885.83880: stderr chunk (state=3): >>><<< 33932 1726882885.83884: stdout chunk (state=3): >>><<< 33932 1726882885.84042: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882885.84046: _low_level_execute_command(): starting 33932 1726882885.84049: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882885.8396854-34291-83270594284760 `" && echo 
ansible-tmp-1726882885.8396854-34291-83270594284760="` echo /root/.ansible/tmp/ansible-tmp-1726882885.8396854-34291-83270594284760 `" ) && sleep 0' 33932 1726882885.85260: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882885.85277: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882885.85292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882885.85310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882885.85349: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882885.85375: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882885.85390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882885.85409: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882885.85421: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882885.85433: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882885.85445: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882885.85459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882885.85479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882885.85491: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882885.85502: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882885.85516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882885.85590: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 33932 1726882885.85611: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882885.85627: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882885.85750: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882885.87651: stdout chunk (state=3): >>>ansible-tmp-1726882885.8396854-34291-83270594284760=/root/.ansible/tmp/ansible-tmp-1726882885.8396854-34291-83270594284760 <<< 33932 1726882885.87826: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882885.87829: stdout chunk (state=3): >>><<< 33932 1726882885.87831: stderr chunk (state=3): >>><<< 33932 1726882885.88194: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882885.8396854-34291-83270594284760=/root/.ansible/tmp/ansible-tmp-1726882885.8396854-34291-83270594284760 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882885.88198: variable 'ansible_module_compression' from source: unknown 33932 1726882885.88201: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 33932 1726882885.88204: ANSIBALLZ: Acquiring lock 33932 1726882885.88206: ANSIBALLZ: Lock acquired: 140301144901104 33932 1726882885.88208: ANSIBALLZ: Creating module 33932 1726882886.14917: ANSIBALLZ: Writing module into payload 33932 1726882886.15702: ANSIBALLZ: Writing module 33932 1726882886.15956: ANSIBALLZ: Renaming module 33932 1726882886.15978: ANSIBALLZ: Done creating module 33932 1726882886.16000: variable 'ansible_facts' from source: unknown 33932 1726882886.16112: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882885.8396854-34291-83270594284760/AnsiballZ_dnf.py 33932 1726882886.16266: Sending initial data 33932 1726882886.16272: Sent initial data (151 bytes) 33932 1726882886.17249: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882886.17265: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882886.17280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882886.17298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882886.17339: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882886.17351: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882886.17370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882886.17388: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882886.17399: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882886.17409: stderr chunk 
(state=3): >>>debug1: re-parsing configuration <<< 33932 1726882886.17424: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882886.17435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882886.17449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882886.17459: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882886.17473: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882886.17490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882886.17603: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882886.17625: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882886.17639: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882886.17774: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882886.19691: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 33932 1726882886.19694: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 33932 1726882886.19841: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 
<<< 33932 1726882886.19845: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-33932njcnmxb6/tmp6m5cc2ba /root/.ansible/tmp/ansible-tmp-1726882885.8396854-34291-83270594284760/AnsiballZ_dnf.py <<< 33932 1726882886.19938: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 33932 1726882886.21931: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882886.21941: stderr chunk (state=3): >>><<< 33932 1726882886.21944: stdout chunk (state=3): >>><<< 33932 1726882886.21968: done transferring module to remote 33932 1726882886.21982: _low_level_execute_command(): starting 33932 1726882886.21985: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882885.8396854-34291-83270594284760/ /root/.ansible/tmp/ansible-tmp-1726882885.8396854-34291-83270594284760/AnsiballZ_dnf.py && sleep 0' 33932 1726882886.22689: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882886.22694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882886.22743: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882886.22747: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882886.22749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882886.22800: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882886.22803: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882886.22901: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882886.24672: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882886.24739: stderr chunk (state=3): >>><<< 33932 1726882886.24743: stdout chunk (state=3): >>><<< 33932 1726882886.24761: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882886.24766: _low_level_execute_command(): starting 33932 1726882886.24772: _low_level_execute_command(): 
executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882885.8396854-34291-83270594284760/AnsiballZ_dnf.py && sleep 0' 33932 1726882886.25419: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882886.25425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882886.25451: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882886.25458: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882886.25473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882886.25483: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882886.25507: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882886.25524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 33932 1726882886.25529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882886.25579: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882886.25606: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882886.25609: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882886.25705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882887.27643: stdout chunk (state=3): >>> 
{"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 33932 1726882887.33670: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 33932 1726882887.33739: stderr chunk (state=3): >>><<< 33932 1726882887.33742: stdout chunk (state=3): >>><<< 33932 1726882887.33762: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
33932 1726882887.33816: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882885.8396854-34291-83270594284760/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 33932 1726882887.33822: _low_level_execute_command(): starting 33932 1726882887.33827: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882885.8396854-34291-83270594284760/ > /dev/null 2>&1 && sleep 0' 33932 1726882887.34500: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882887.34510: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882887.34519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882887.34532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882887.34574: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882887.34581: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882887.34591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882887.34605: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882887.34612: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882887.34620: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882887.34627: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882887.34636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882887.34648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882887.34656: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882887.34665: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882887.34673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882887.34745: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882887.34766: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882887.34779: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882887.34898: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882887.37006: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882887.37213: stderr chunk (state=3): >>><<< 33932 1726882887.37280: stdout chunk (state=3): >>><<< 33932 1726882887.37314: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882887.37322: handler run complete 33932 1726882887.37537: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 33932 1726882887.37741: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 33932 1726882887.37782: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 33932 1726882887.37821: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 33932 1726882887.37848: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 33932 1726882887.37953: variable '__install_status' from source: unknown 33932 1726882887.38040: Evaluated conditional (__install_status is success): True 33932 1726882887.38057: attempt loop complete, returning result 33932 1726882887.38060: _execute() done 33932 1726882887.38065: dumping result to json 33932 1726882887.38072: done dumping result, returning 33932 1726882887.38078: done running TaskExecutor() for managed_node1/TASK: Install iproute [0e448fcc-3ce9-615b-5c48-00000000021e] 33932 1726882887.38087: sending task result for task 0e448fcc-3ce9-615b-5c48-00000000021e 33932 1726882887.38352: done sending task result for task 
0e448fcc-3ce9-615b-5c48-00000000021e 33932 1726882887.38356: WORKER PROCESS EXITING ok: [managed_node1] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 33932 1726882887.38493: no more pending results, returning what we have 33932 1726882887.38497: results queue empty 33932 1726882887.38498: checking for any_errors_fatal 33932 1726882887.38503: done checking for any_errors_fatal 33932 1726882887.38504: checking for max_fail_percentage 33932 1726882887.38505: done checking for max_fail_percentage 33932 1726882887.38506: checking to see if all hosts have failed and the running result is not ok 33932 1726882887.38507: done checking to see if all hosts have failed 33932 1726882887.38507: getting the remaining hosts for this loop 33932 1726882887.38509: done getting the remaining hosts for this loop 33932 1726882887.38513: getting the next task for host managed_node1 33932 1726882887.38519: done getting next task for host managed_node1 33932 1726882887.38521: ^ task is: TASK: Create veth interface {{ interface }} 33932 1726882887.38524: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882887.38528: getting variables 33932 1726882887.38529: in VariableManager get_vars() 33932 1726882887.38570: Calling all_inventory to load vars for managed_node1 33932 1726882887.38574: Calling groups_inventory to load vars for managed_node1 33932 1726882887.38576: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882887.38588: Calling all_plugins_play to load vars for managed_node1 33932 1726882887.38591: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882887.38594: Calling groups_plugins_play to load vars for managed_node1 33932 1726882887.38787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882887.39020: done with get_vars() 33932 1726882887.39032: done getting variables 33932 1726882887.39091: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 33932 1726882887.39897: variable 'interface' from source: play vars TASK [Create veth interface lsr101] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 21:41:27 -0400 (0:00:01.653) 0:00:07.867 ****** 33932 1726882887.39942: entering _queue_task() for managed_node1/command 33932 1726882887.40620: worker is 1 (out of 1 available) 33932 1726882887.40634: exiting _queue_task() for managed_node1/command 33932 1726882887.40646: done queuing things up, now waiting for results queue to drain 33932 1726882887.40647: waiting for pending results... 
33932 1726882887.41182: running TaskExecutor() for managed_node1/TASK: Create veth interface lsr101 33932 1726882887.41985: in run() - task 0e448fcc-3ce9-615b-5c48-00000000021f 33932 1726882887.42011: variable 'ansible_search_path' from source: unknown 33932 1726882887.42022: variable 'ansible_search_path' from source: unknown 33932 1726882887.42382: variable 'interface' from source: play vars 33932 1726882887.42778: variable 'interface' from source: play vars 33932 1726882887.42873: variable 'interface' from source: play vars 33932 1726882887.43149: Loaded config def from plugin (lookup/items) 33932 1726882887.43484: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 33932 1726882887.43520: variable 'omit' from source: magic vars 33932 1726882887.43647: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882887.43899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882887.43926: variable 'omit' from source: magic vars 33932 1726882887.44402: variable 'ansible_distribution_major_version' from source: facts 33932 1726882887.45204: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882887.45608: variable 'type' from source: play vars 33932 1726882887.46386: variable 'state' from source: include params 33932 1726882887.46407: variable 'interface' from source: play vars 33932 1726882887.46428: variable 'current_interfaces' from source: set_fact 33932 1726882887.46453: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 33932 1726882887.46489: variable 'omit' from source: magic vars 33932 1726882887.46569: variable 'omit' from source: magic vars 33932 1726882887.46661: variable 'item' from source: unknown 33932 1726882887.46827: variable 'item' from source: unknown 33932 1726882887.46875: variable 'omit' from source: magic vars 33932 1726882887.46953: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882887.47027: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882887.47062: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882887.47118: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882887.47151: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882887.47215: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882887.47246: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882887.47270: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882887.47462: Set connection var ansible_shell_executable to /bin/sh 33932 1726882887.47503: Set connection var ansible_timeout to 10 33932 1726882887.47524: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882887.47541: Set connection var ansible_pipelining to False 33932 1726882887.47572: Set connection var ansible_connection to ssh 33932 1726882887.47586: Set connection var ansible_shell_type to sh 33932 1726882887.47640: variable 'ansible_shell_executable' from source: unknown 33932 1726882887.47657: variable 'ansible_connection' from source: unknown 33932 1726882887.47674: variable 'ansible_module_compression' from source: unknown 33932 1726882887.47690: variable 'ansible_shell_type' from source: unknown 33932 1726882887.47723: variable 'ansible_shell_executable' from source: unknown 33932 1726882887.47737: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882887.47752: variable 'ansible_pipelining' from source: unknown 33932 1726882887.47771: variable 'ansible_timeout' from 
source: unknown 33932 1726882887.47809: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882887.48062: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882887.48095: variable 'omit' from source: magic vars 33932 1726882887.48117: starting attempt loop 33932 1726882887.48134: running the handler 33932 1726882887.48160: _low_level_execute_command(): starting 33932 1726882887.48182: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 33932 1726882887.49009: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882887.49024: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882887.49040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882887.49058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882887.49109: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882887.49124: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882887.49137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882887.49156: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882887.49173: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882887.49185: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882887.49198: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882887.49212: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882887.49233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882887.49245: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882887.49256: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882887.49275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882887.49355: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882887.49382: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882887.49397: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882887.49521: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882887.51225: stdout chunk (state=3): >>>/root <<< 33932 1726882887.51414: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882887.51419: stdout chunk (state=3): >>><<< 33932 1726882887.51429: stderr chunk (state=3): >>><<< 33932 1726882887.51456: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882887.51472: _low_level_execute_command(): starting 33932 1726882887.51476: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882887.5145535-34369-77319699546642 `" && echo ansible-tmp-1726882887.5145535-34369-77319699546642="` echo /root/.ansible/tmp/ansible-tmp-1726882887.5145535-34369-77319699546642 `" ) && sleep 0' 33932 1726882887.52150: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882887.52159: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882887.52173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882887.52186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882887.52236: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882887.52243: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882887.52252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882887.52267: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882887.52276: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882887.52433: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 
1726882887.52437: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882887.52439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882887.52442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882887.52444: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882887.52449: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882887.52451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882887.52453: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882887.52455: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882887.52458: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882887.52589: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882887.54480: stdout chunk (state=3): >>>ansible-tmp-1726882887.5145535-34369-77319699546642=/root/.ansible/tmp/ansible-tmp-1726882887.5145535-34369-77319699546642 <<< 33932 1726882887.54673: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882887.54676: stdout chunk (state=3): >>><<< 33932 1726882887.54680: stderr chunk (state=3): >>><<< 33932 1726882887.54770: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882887.5145535-34369-77319699546642=/root/.ansible/tmp/ansible-tmp-1726882887.5145535-34369-77319699546642 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882887.54774: variable 'ansible_module_compression' from source: unknown 33932 1726882887.54895: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-33932njcnmxb6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 33932 1726882887.54898: variable 'ansible_facts' from source: unknown 33932 1726882887.54931: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882887.5145535-34369-77319699546642/AnsiballZ_command.py 33932 1726882887.55517: Sending initial data 33932 1726882887.55521: Sent initial data (155 bytes) 33932 1726882887.57796: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882887.57799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882887.57836: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 33932 1726882887.57839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882887.57841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882887.57915: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882887.57918: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882887.57921: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882887.58026: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882887.59787: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 33932 1726882887.59883: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 33932 1726882887.59979: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-33932njcnmxb6/tmpafntn9mw /root/.ansible/tmp/ansible-tmp-1726882887.5145535-34369-77319699546642/AnsiballZ_command.py <<< 33932 
1726882887.60070: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 33932 1726882887.61384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882887.61698: stderr chunk (state=3): >>><<< 33932 1726882887.61702: stdout chunk (state=3): >>><<< 33932 1726882887.61704: done transferring module to remote 33932 1726882887.61706: _low_level_execute_command(): starting 33932 1726882887.61709: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882887.5145535-34369-77319699546642/ /root/.ansible/tmp/ansible-tmp-1726882887.5145535-34369-77319699546642/AnsiballZ_command.py && sleep 0' 33932 1726882887.62543: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882887.62557: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882887.62576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882887.62594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882887.62668: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882887.62684: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882887.62700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882887.62719: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882887.62742: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882887.62755: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882887.62773: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882887.62787: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 33932 1726882887.62801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882887.62812: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882887.62822: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882887.62836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882887.62920: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882887.62940: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882887.62962: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882887.63094: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882887.64890: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882887.64894: stdout chunk (state=3): >>><<< 33932 1726882887.64897: stderr chunk (state=3): >>><<< 33932 1726882887.64982: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882887.64985: _low_level_execute_command(): starting 33932 1726882887.64988: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882887.5145535-34369-77319699546642/AnsiballZ_command.py && sleep 0' 33932 1726882887.65558: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882887.65574: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882887.65588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882887.65604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882887.65646: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882887.65666: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882887.65681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882887.65698: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882887.65710: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882887.65721: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882887.65734: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882887.65748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882887.65773: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882887.65787: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882887.65799: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882887.65813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882887.65898: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882887.65919: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882887.65935: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882887.66062: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882887.80301: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "lsr101", "type", "veth", "peer", "name", "peerlsr101"], "start": "2024-09-20 21:41:27.791467", "end": "2024-09-20 21:41:27.801365", "delta": "0:00:00.009898", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add lsr101 type veth peer name peerlsr101", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 33932 1726882887.82480: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 33932 1726882887.82484: stdout chunk (state=3): >>><<< 33932 1726882887.82487: stderr chunk (state=3): >>><<< 33932 1726882887.82640: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "lsr101", "type", "veth", "peer", "name", "peerlsr101"], "start": "2024-09-20 21:41:27.791467", "end": "2024-09-20 21:41:27.801365", "delta": "0:00:00.009898", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add lsr101 type veth peer name peerlsr101", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
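The records above trace one full pass of Ansible's `_low_level_execute_command()` lifecycle against the managed node: create a mode-0700 temp directory under `~/.ansible/tmp`, sftp the `AnsiballZ_command.py` payload into it, `chmod u+x` the directory and payload, run the payload with the remote Python, and finally `rm -f -r` the temp directory. A minimal local sketch of that same sequence (an illustration only, assuming a POSIX shell and `python3` on `PATH`; the `mktemp` pattern and the one-line stand-in payload are hypothetical, not the real `ansible-tmp-<timestamp>-<pid>-<random>` name or AnsiballZ wrapper):

```shell
# Local sketch of the remote-side lifecycle seen in the log chunks above.
# 1. Private temp dir, created under umask 77 as Ansible does.
tmpdir="$(umask 77 && mktemp -d "${TMPDIR:-/tmp}/ansible-tmp-XXXXXX")"
# 2. Stand-in for the AnsiballZ_command.py payload transferred over sftp.
printf '%s\n' 'print("ok")' > "$tmpdir/AnsiballZ_command.py"
# 3. Same chmod invocation shape as the logged 'chmod u+x <dir> <payload>'.
chmod u+x "$tmpdir" "$tmpdir/AnsiballZ_command.py"
# 4. Execute the payload with the interpreter (the log uses python3.9 remotely).
python3 "$tmpdir/AnsiballZ_command.py"
# 5. Cleanup, mirroring the logged 'rm -f -r .../ansible-tmp-.../'.
rm -rf "$tmpdir"
```

Over SSH, each of these steps is a separate `/bin/sh -c '... && sleep 0'` invocation multiplexed through the existing ControlMaster session, which is why every step in the log re-emits the same `mux_client_request_session` debug lines.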
33932 1726882887.82649: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add lsr101 type veth peer name peerlsr101', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882887.5145535-34369-77319699546642/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 33932 1726882887.82652: _low_level_execute_command(): starting 33932 1726882887.82654: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882887.5145535-34369-77319699546642/ > /dev/null 2>&1 && sleep 0' 33932 1726882887.84521: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882887.84562: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882887.84586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882887.84625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882887.84695: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882887.84740: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882887.84767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882887.84786: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882887.84799: 
stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882887.84811: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882887.84823: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882887.84837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882887.84880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882887.84894: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882887.84906: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882887.84921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882887.85139: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882887.85186: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882887.85209: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882887.85381: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882887.87334: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882887.87423: stderr chunk (state=3): >>><<< 33932 1726882887.87427: stdout chunk (state=3): >>><<< 33932 1726882887.87553: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882887.87557: handler run complete 33932 1726882887.87559: Evaluated conditional (False): False 33932 1726882887.87561: attempt loop complete, returning result 33932 1726882887.87565: variable 'item' from source: unknown 33932 1726882887.87709: variable 'item' from source: unknown ok: [managed_node1] => (item=ip link add lsr101 type veth peer name peerlsr101) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "lsr101", "type", "veth", "peer", "name", "peerlsr101" ], "delta": "0:00:00.009898", "end": "2024-09-20 21:41:27.801365", "item": "ip link add lsr101 type veth peer name peerlsr101", "rc": 0, "start": "2024-09-20 21:41:27.791467" } 33932 1726882887.87983: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882887.87986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882887.87989: variable 'omit' from source: magic vars 33932 1726882887.88176: variable 'ansible_distribution_major_version' from source: facts 33932 1726882887.88187: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882887.88511: variable 'type' from source: play vars 33932 1726882887.88521: variable 'state' from source: include params 33932 1726882887.88530: variable 'interface' 
from source: play vars 33932 1726882887.88538: variable 'current_interfaces' from source: set_fact 33932 1726882887.88548: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 33932 1726882887.88566: variable 'omit' from source: magic vars 33932 1726882887.88586: variable 'omit' from source: magic vars 33932 1726882887.88630: variable 'item' from source: unknown 33932 1726882887.88701: variable 'item' from source: unknown 33932 1726882887.88720: variable 'omit' from source: magic vars 33932 1726882887.88744: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882887.88756: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882887.88775: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882887.88793: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882887.88877: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882887.88890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882887.88974: Set connection var ansible_shell_executable to /bin/sh 33932 1726882887.89082: Set connection var ansible_timeout to 10 33932 1726882887.89091: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882887.89106: Set connection var ansible_pipelining to False 33932 1726882887.89112: Set connection var ansible_connection to ssh 33932 1726882887.89118: Set connection var ansible_shell_type to sh 33932 1726882887.89140: variable 'ansible_shell_executable' from source: unknown 33932 1726882887.89147: variable 'ansible_connection' from source: unknown 33932 1726882887.89153: variable 'ansible_module_compression' from source: 
unknown 33932 1726882887.89166: variable 'ansible_shell_type' from source: unknown 33932 1726882887.89174: variable 'ansible_shell_executable' from source: unknown 33932 1726882887.89216: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882887.89224: variable 'ansible_pipelining' from source: unknown 33932 1726882887.89230: variable 'ansible_timeout' from source: unknown 33932 1726882887.89238: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882887.89385: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882887.89481: variable 'omit' from source: magic vars 33932 1726882887.89490: starting attempt loop 33932 1726882887.89496: running the handler 33932 1726882887.89507: _low_level_execute_command(): starting 33932 1726882887.89515: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 33932 1726882887.91118: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882887.91143: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882887.91173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882887.91195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882887.91242: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882887.91254: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882887.91271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882887.91289: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass <<< 33932 1726882887.91302: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882887.91411: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882887.91482: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882887.91508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882887.91525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882887.91562: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882887.91617: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882887.91679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882887.91838: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882887.91898: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882887.91956: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882887.92148: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882887.93707: stdout chunk (state=3): >>>/root <<< 33932 1726882887.93804: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882887.93868: stderr chunk (state=3): >>><<< 33932 1726882887.93872: stdout chunk (state=3): >>><<< 33932 1726882887.93983: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882887.94007: _low_level_execute_command(): starting 33932 1726882887.94010: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882887.9388757-34369-45023147073504 `" && echo ansible-tmp-1726882887.9388757-34369-45023147073504="` echo /root/.ansible/tmp/ansible-tmp-1726882887.9388757-34369-45023147073504 `" ) && sleep 0' 33932 1726882887.95061: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882887.95067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882887.95108: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882887.95111: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is 
address debug1: re-parsing configuration <<< 33932 1726882887.95113: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 33932 1726882887.95116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882887.95171: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882887.95175: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882887.95192: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882887.95311: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882887.97374: stdout chunk (state=3): >>>ansible-tmp-1726882887.9388757-34369-45023147073504=/root/.ansible/tmp/ansible-tmp-1726882887.9388757-34369-45023147073504 <<< 33932 1726882887.97378: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882887.97396: stderr chunk (state=3): >>><<< 33932 1726882887.97399: stdout chunk (state=3): >>><<< 33932 1726882887.97423: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882887.9388757-34369-45023147073504=/root/.ansible/tmp/ansible-tmp-1726882887.9388757-34369-45023147073504 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882887.97472: variable 'ansible_module_compression' from source: unknown 33932 1726882887.98177: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-33932njcnmxb6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 33932 1726882887.98181: variable 'ansible_facts' from source: unknown 33932 1726882887.98183: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882887.9388757-34369-45023147073504/AnsiballZ_command.py 33932 1726882887.98273: Sending initial data 33932 1726882887.98282: Sent initial data (155 bytes) 33932 1726882888.00201: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882888.00206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882888.00276: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 33932 1726882888.00283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration <<< 33932 1726882888.00288: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882888.00302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882888.00307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882888.00832: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882888.00853: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882888.00860: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882888.01004: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882888.02736: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 33932 1726882888.02851: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 33932 1726882888.02923: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-33932njcnmxb6/tmp4a0xoj56 /root/.ansible/tmp/ansible-tmp-1726882887.9388757-34369-45023147073504/AnsiballZ_command.py <<< 33932 1726882888.03011: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: 
No such file or directory <<< 33932 1726882888.04657: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882888.04746: stderr chunk (state=3): >>><<< 33932 1726882888.04749: stdout chunk (state=3): >>><<< 33932 1726882888.04773: done transferring module to remote 33932 1726882888.04781: _low_level_execute_command(): starting 33932 1726882888.04786: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882887.9388757-34369-45023147073504/ /root/.ansible/tmp/ansible-tmp-1726882887.9388757-34369-45023147073504/AnsiballZ_command.py && sleep 0' 33932 1726882888.06481: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882888.06515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882888.06521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882888.06574: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 33932 1726882888.06580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882888.06623: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882888.06708: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882888.06711: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882888.06724: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882888.06848: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882888.08637: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882888.08731: stderr chunk (state=3): >>><<< 33932 1726882888.08734: stdout chunk (state=3): >>><<< 33932 1726882888.08772: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882888.08776: _low_level_execute_command(): starting 33932 1726882888.08778: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882887.9388757-34369-45023147073504/AnsiballZ_command.py && sleep 0' 33932 1726882888.09390: stderr chunk 
(state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882888.09399: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882888.09408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882888.09421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882888.09456: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882888.09462: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882888.09474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882888.09487: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882888.09494: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882888.09500: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882888.09574: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882888.09579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882888.09581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882888.09583: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882888.09585: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882888.09618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882888.09686: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882888.09697: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882888.09709: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 33932 1726882888.09830: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882888.23334: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerlsr101", "up"], "start": "2024-09-20 21:41:28.228226", "end": "2024-09-20 21:41:28.231942", "delta": "0:00:00.003716", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerlsr101 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 33932 1726882888.24648: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 33932 1726882888.24652: stderr chunk (state=3): >>><<< 33932 1726882888.24655: stdout chunk (state=3): >>><<< 33932 1726882888.24682: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerlsr101", "up"], "start": "2024-09-20 21:41:28.228226", "end": "2024-09-20 21:41:28.231942", "delta": "0:00:00.003716", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerlsr101 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 33932 1726882888.24714: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerlsr101 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882887.9388757-34369-45023147073504/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 33932 1726882888.24720: _low_level_execute_command(): starting 33932 1726882888.24725: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882887.9388757-34369-45023147073504/ > /dev/null 2>&1 && sleep 0' 33932 1726882888.25388: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882888.25397: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 
1726882888.25408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882888.25423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882888.25462: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882888.25472: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882888.25481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882888.25497: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882888.25504: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882888.25510: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882888.25518: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882888.25526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882888.25541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882888.25544: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882888.25550: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882888.25560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882888.25635: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882888.25652: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882888.25666: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882888.25793: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 
1726882888.27676: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882888.27680: stdout chunk (state=3): >>><<< 33932 1726882888.27688: stderr chunk (state=3): >>><<< 33932 1726882888.27704: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882888.27711: handler run complete 33932 1726882888.27731: Evaluated conditional (False): False 33932 1726882888.27740: attempt loop complete, returning result 33932 1726882888.27760: variable 'item' from source: unknown 33932 1726882888.27842: variable 'item' from source: unknown ok: [managed_node1] => (item=ip link set peerlsr101 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerlsr101", "up" ], "delta": "0:00:00.003716", "end": "2024-09-20 21:41:28.231942", "item": "ip link set peerlsr101 up", "rc": 0, "start": "2024-09-20 
21:41:28.228226" } 33932 1726882888.28049: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882888.28053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882888.28056: variable 'omit' from source: magic vars 33932 1726882888.28186: variable 'ansible_distribution_major_version' from source: facts 33932 1726882888.28192: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882888.28455: variable 'type' from source: play vars 33932 1726882888.28459: variable 'state' from source: include params 33932 1726882888.28461: variable 'interface' from source: play vars 33932 1726882888.28471: variable 'current_interfaces' from source: set_fact 33932 1726882888.28474: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 33932 1726882888.28480: variable 'omit' from source: magic vars 33932 1726882888.28501: variable 'omit' from source: magic vars 33932 1726882888.28577: variable 'item' from source: unknown 33932 1726882888.28675: variable 'item' from source: unknown 33932 1726882888.28702: variable 'omit' from source: magic vars 33932 1726882888.28746: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882888.28773: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882888.28788: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882888.28808: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882888.28816: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882888.28825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882888.28948: Set 
connection var ansible_shell_executable to /bin/sh 33932 1726882888.28962: Set connection var ansible_timeout to 10 33932 1726882888.28987: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882888.28998: Set connection var ansible_pipelining to False 33932 1726882888.29004: Set connection var ansible_connection to ssh 33932 1726882888.29010: Set connection var ansible_shell_type to sh 33932 1726882888.29038: variable 'ansible_shell_executable' from source: unknown 33932 1726882888.29051: variable 'ansible_connection' from source: unknown 33932 1726882888.29060: variable 'ansible_module_compression' from source: unknown 33932 1726882888.29070: variable 'ansible_shell_type' from source: unknown 33932 1726882888.29098: variable 'ansible_shell_executable' from source: unknown 33932 1726882888.29108: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882888.29122: variable 'ansible_pipelining' from source: unknown 33932 1726882888.29134: variable 'ansible_timeout' from source: unknown 33932 1726882888.29145: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882888.29287: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882888.29310: variable 'omit' from source: magic vars 33932 1726882888.29323: starting attempt loop 33932 1726882888.29329: running the handler 33932 1726882888.29338: _low_level_execute_command(): starting 33932 1726882888.29345: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 33932 1726882888.30170: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882888.30195: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 
1726882888.30210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882888.30227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882888.30276: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882888.30293: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882888.30313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882888.30330: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882888.30341: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882888.30351: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882888.30366: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882888.30382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882888.30402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882888.30420: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882888.30431: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882888.30444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882888.30535: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882888.30559: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882888.30578: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882888.30709: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 
1726882888.32329: stdout chunk (state=3): >>>/root <<< 33932 1726882888.32429: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882888.32526: stderr chunk (state=3): >>><<< 33932 1726882888.32536: stdout chunk (state=3): >>><<< 33932 1726882888.32662: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882888.32672: _low_level_execute_command(): starting 33932 1726882888.32675: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882888.325671-34369-33645656378733 `" && echo ansible-tmp-1726882888.325671-34369-33645656378733="` echo /root/.ansible/tmp/ansible-tmp-1726882888.325671-34369-33645656378733 `" ) && sleep 0' 33932 1726882888.33350: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 
Jun 2024 <<< 33932 1726882888.33365: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882888.33380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882888.33398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882888.33454: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882888.33468: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882888.33483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882888.33500: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882888.33511: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882888.33527: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882888.33546: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882888.33560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882888.33578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882888.33590: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882888.33600: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882888.33612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882888.33704: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882888.33726: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882888.33749: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
33932 1726882888.33892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882888.35765: stdout chunk (state=3): >>>ansible-tmp-1726882888.325671-34369-33645656378733=/root/.ansible/tmp/ansible-tmp-1726882888.325671-34369-33645656378733 <<< 33932 1726882888.35882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882888.35975: stderr chunk (state=3): >>><<< 33932 1726882888.35991: stdout chunk (state=3): >>><<< 33932 1726882888.36177: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882888.325671-34369-33645656378733=/root/.ansible/tmp/ansible-tmp-1726882888.325671-34369-33645656378733 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882888.36180: variable 'ansible_module_compression' from source: unknown 33932 1726882888.36182: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-33932njcnmxb6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 33932 1726882888.36184: variable 'ansible_facts' from source: unknown 33932 1726882888.36186: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882888.325671-34369-33645656378733/AnsiballZ_command.py 33932 1726882888.36302: Sending initial data 33932 1726882888.36305: Sent initial data (154 bytes) 33932 1726882888.37266: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882888.37290: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882888.37305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882888.37323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882888.37366: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882888.37379: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882888.37402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882888.37420: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882888.37431: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882888.37440: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882888.37451: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882888.37463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882888.37481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882888.37492: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 
1726882888.37512: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882888.37527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882888.37602: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882888.37633: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882888.37650: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882888.37801: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882888.39634: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 33932 1726882888.39666: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 33932 1726882888.39704: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 33932 1726882888.39800: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-33932njcnmxb6/tmpzsur7a6f /root/.ansible/tmp/ansible-tmp-1726882888.325671-34369-33645656378733/AnsiballZ_command.py <<< 33932 1726882888.39932: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 33932 1726882888.41483: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882888.41631: stderr chunk (state=3): >>><<< 33932 1726882888.41634: stdout chunk (state=3): 
>>><<< 33932 1726882888.41654: done transferring module to remote 33932 1726882888.41662: _low_level_execute_command(): starting 33932 1726882888.41668: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882888.325671-34369-33645656378733/ /root/.ansible/tmp/ansible-tmp-1726882888.325671-34369-33645656378733/AnsiballZ_command.py && sleep 0' 33932 1726882888.42803: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882888.42809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882888.42869: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 33932 1726882888.42878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882888.42883: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882888.42888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882888.42903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 33932 1726882888.42908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882888.42985: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882888.42993: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882888.43005: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 33932 1726882888.43120: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882888.44969: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882888.44997: stderr chunk (state=3): >>><<< 33932 1726882888.45001: stdout chunk (state=3): >>><<< 33932 1726882888.45031: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882888.45110: _low_level_execute_command(): starting 33932 1726882888.45113: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882888.325671-34369-33645656378733/AnsiballZ_command.py && sleep 0' 33932 1726882888.45694: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882888.45706: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 33932 1726882888.45719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882888.45734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882888.45778: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882888.45795: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882888.45808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882888.45823: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882888.45834: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882888.45843: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882888.45853: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882888.45869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882888.45886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882888.45898: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882888.45913: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882888.45925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882888.46005: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882888.46030: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882888.46045: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882888.46172: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 33932 1726882888.60016: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "lsr101", "up"], "start": "2024-09-20 21:41:28.592158", "end": "2024-09-20 21:41:28.598748", "delta": "0:00:00.006590", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set lsr101 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 33932 1726882888.61184: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 33932 1726882888.61290: stderr chunk (state=3): >>><<< 33932 1726882888.61293: stdout chunk (state=3): >>><<< 33932 1726882888.61388: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "lsr101", "up"], "start": "2024-09-20 21:41:28.592158", "end": "2024-09-20 21:41:28.598748", "delta": "0:00:00.006590", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set lsr101 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 33932 1726882888.61399: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set lsr101 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882888.325671-34369-33645656378733/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 33932 1726882888.61401: _low_level_execute_command(): starting 33932 1726882888.61404: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882888.325671-34369-33645656378733/ > /dev/null 2>&1 && sleep 0' 33932 1726882888.62431: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882888.62476: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882888.62494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882888.62589: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882888.62633: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882888.62644: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882888.62656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882888.62678: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882888.62693: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882888.62711: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882888.62722: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882888.62734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882888.62748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882888.62758: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882888.62771: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882888.62784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882888.62970: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882888.62992: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882888.63010: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882888.63154: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882888.65034: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882888.65037: stdout chunk (state=3): >>><<< 33932 
1726882888.65039: stderr chunk (state=3): >>><<< 33932 1726882888.65276: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882888.65280: handler run complete 33932 1726882888.65282: Evaluated conditional (False): False 33932 1726882888.65284: attempt loop complete, returning result 33932 1726882888.65286: variable 'item' from source: unknown 33932 1726882888.65288: variable 'item' from source: unknown ok: [managed_node1] => (item=ip link set lsr101 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "lsr101", "up" ], "delta": "0:00:00.006590", "end": "2024-09-20 21:41:28.598748", "item": "ip link set lsr101 up", "rc": 0, "start": "2024-09-20 21:41:28.592158" } 33932 1726882888.65390: dumping result to json 33932 1726882888.65393: done dumping result, returning 33932 1726882888.65396: done running 
TaskExecutor() for managed_node1/TASK: Create veth interface lsr101 [0e448fcc-3ce9-615b-5c48-00000000021f] 33932 1726882888.65398: sending task result for task 0e448fcc-3ce9-615b-5c48-00000000021f 33932 1726882888.65575: done sending task result for task 0e448fcc-3ce9-615b-5c48-00000000021f 33932 1726882888.65578: WORKER PROCESS EXITING 33932 1726882888.65694: no more pending results, returning what we have 33932 1726882888.65698: results queue empty 33932 1726882888.65699: checking for any_errors_fatal 33932 1726882888.65704: done checking for any_errors_fatal 33932 1726882888.65705: checking for max_fail_percentage 33932 1726882888.65706: done checking for max_fail_percentage 33932 1726882888.65707: checking to see if all hosts have failed and the running result is not ok 33932 1726882888.65708: done checking to see if all hosts have failed 33932 1726882888.65708: getting the remaining hosts for this loop 33932 1726882888.65710: done getting the remaining hosts for this loop 33932 1726882888.65714: getting the next task for host managed_node1 33932 1726882888.65719: done getting next task for host managed_node1 33932 1726882888.65721: ^ task is: TASK: Set up veth as managed by NetworkManager 33932 1726882888.65724: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882888.65727: getting variables 33932 1726882888.65730: in VariableManager get_vars() 33932 1726882888.65765: Calling all_inventory to load vars for managed_node1 33932 1726882888.65768: Calling groups_inventory to load vars for managed_node1 33932 1726882888.65771: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882888.65781: Calling all_plugins_play to load vars for managed_node1 33932 1726882888.65784: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882888.65787: Calling groups_plugins_play to load vars for managed_node1 33932 1726882888.65957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882888.66338: done with get_vars() 33932 1726882888.66348: done getting variables 33932 1726882888.66494: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 21:41:28 -0400 (0:00:01.265) 0:00:09.132 ****** 33932 1726882888.66530: entering _queue_task() for managed_node1/command 33932 1726882888.67058: worker is 1 (out of 1 available) 33932 1726882888.67100: exiting _queue_task() for managed_node1/command 33932 1726882888.67111: done queuing things up, now waiting for results queue to drain 33932 1726882888.67113: waiting for pending results... 
33932 1726882888.67547: running TaskExecutor() for managed_node1/TASK: Set up veth as managed by NetworkManager 33932 1726882888.67653: in run() - task 0e448fcc-3ce9-615b-5c48-000000000220 33932 1726882888.67673: variable 'ansible_search_path' from source: unknown 33932 1726882888.67681: variable 'ansible_search_path' from source: unknown 33932 1726882888.67736: calling self._execute() 33932 1726882888.67830: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882888.67845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882888.67857: variable 'omit' from source: magic vars 33932 1726882888.68228: variable 'ansible_distribution_major_version' from source: facts 33932 1726882888.68245: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882888.68424: variable 'type' from source: play vars 33932 1726882888.68602: variable 'state' from source: include params 33932 1726882888.68613: Evaluated conditional (type == 'veth' and state == 'present'): True 33932 1726882888.68623: variable 'omit' from source: magic vars 33932 1726882888.68659: variable 'omit' from source: magic vars 33932 1726882888.68875: variable 'interface' from source: play vars 33932 1726882888.68895: variable 'omit' from source: magic vars 33932 1726882888.68954: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882888.69066: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882888.69157: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882888.69181: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882888.69196: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 
33932 1726882888.69274: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882888.69283: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882888.69359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882888.69582: Set connection var ansible_shell_executable to /bin/sh 33932 1726882888.69595: Set connection var ansible_timeout to 10 33932 1726882888.69604: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882888.69612: Set connection var ansible_pipelining to False 33932 1726882888.69619: Set connection var ansible_connection to ssh 33932 1726882888.69624: Set connection var ansible_shell_type to sh 33932 1726882888.69747: variable 'ansible_shell_executable' from source: unknown 33932 1726882888.69865: variable 'ansible_connection' from source: unknown 33932 1726882888.69874: variable 'ansible_module_compression' from source: unknown 33932 1726882888.69882: variable 'ansible_shell_type' from source: unknown 33932 1726882888.69892: variable 'ansible_shell_executable' from source: unknown 33932 1726882888.69916: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882888.69924: variable 'ansible_pipelining' from source: unknown 33932 1726882888.69983: variable 'ansible_timeout' from source: unknown 33932 1726882888.70055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882888.70422: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882888.70444: variable 'omit' from source: magic vars 33932 1726882888.70453: starting attempt loop 33932 1726882888.70466: running the handler 33932 1726882888.70486: _low_level_execute_command(): 
starting 33932 1726882888.70516: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 33932 1726882888.71329: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882888.71347: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882888.71361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882888.71384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882888.71425: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882888.71447: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882888.71460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882888.71481: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882888.71493: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882888.71502: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882888.71513: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882888.71525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882888.71539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882888.71560: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882888.71575: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882888.71590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882888.71674: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 
1726882888.71698: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882888.71712: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882888.71833: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882888.73433: stdout chunk (state=3): >>>/root <<< 33932 1726882888.73613: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882888.73616: stdout chunk (state=3): >>><<< 33932 1726882888.73619: stderr chunk (state=3): >>><<< 33932 1726882888.73732: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882888.73737: _low_level_execute_command(): starting 33932 1726882888.73746: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882888.736469-34423-145807226562770 `" && echo ansible-tmp-1726882888.736469-34423-145807226562770="` echo /root/.ansible/tmp/ansible-tmp-1726882888.736469-34423-145807226562770 `" ) && sleep 0' 33932 1726882888.77677: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882888.77695: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882888.77726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882888.77835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882888.77880: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882888.77935: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882888.77955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882888.77977: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882888.77990: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882888.78002: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882888.78014: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882888.78033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882888.78054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882888.78155: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882888.78176: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882888.78192: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882888.78391: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882888.78414: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882888.78432: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882888.78604: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882888.80494: stdout chunk (state=3): >>>ansible-tmp-1726882888.736469-34423-145807226562770=/root/.ansible/tmp/ansible-tmp-1726882888.736469-34423-145807226562770 <<< 33932 1726882888.80673: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882888.80676: stdout chunk (state=3): >>><<< 33932 1726882888.80680: stderr chunk (state=3): >>><<< 33932 1726882888.80875: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882888.736469-34423-145807226562770=/root/.ansible/tmp/ansible-tmp-1726882888.736469-34423-145807226562770 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: 
fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882888.80879: variable 'ansible_module_compression' from source: unknown 33932 1726882888.80881: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-33932njcnmxb6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 33932 1726882888.80883: variable 'ansible_facts' from source: unknown 33932 1726882888.80927: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882888.736469-34423-145807226562770/AnsiballZ_command.py 33932 1726882888.81522: Sending initial data 33932 1726882888.81526: Sent initial data (155 bytes) 33932 1726882888.84117: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882888.84184: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882888.84198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882888.84215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882888.84369: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882888.84384: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882888.84403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882888.84421: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882888.84432: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882888.84442: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882888.84454: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882888.84477: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882888.84496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882888.84513: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882888.84524: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882888.84538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882888.84651: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882888.84745: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882888.84762: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882888.84919: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882888.86698: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 33932 1726882888.86790: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 33932 1726882888.86893: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-33932njcnmxb6/tmpn588wuqt /root/.ansible/tmp/ansible-tmp-1726882888.736469-34423-145807226562770/AnsiballZ_command.py <<< 33932 1726882888.86988: stderr chunk (state=3): 
>>>debug1: Couldn't stat remote file: No such file or directory <<< 33932 1726882888.88596: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882888.88765: stderr chunk (state=3): >>><<< 33932 1726882888.88768: stdout chunk (state=3): >>><<< 33932 1726882888.88771: done transferring module to remote 33932 1726882888.88773: _low_level_execute_command(): starting 33932 1726882888.88776: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882888.736469-34423-145807226562770/ /root/.ansible/tmp/ansible-tmp-1726882888.736469-34423-145807226562770/AnsiballZ_command.py && sleep 0' 33932 1726882888.90359: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882888.90375: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882888.90388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882888.90403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882888.90452: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882888.90536: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882888.90551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882888.90572: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882888.90585: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882888.90595: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882888.90606: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882888.90619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882888.90639: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882888.90654: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882888.90665: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882888.90680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882888.90870: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882888.90887: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882888.90901: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882888.91081: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882888.92879: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882888.92939: stderr chunk (state=3): >>><<< 33932 1726882888.92942: stdout chunk (state=3): >>><<< 33932 1726882888.93034: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882888.93038: _low_level_execute_command(): starting 33932 1726882888.93043: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882888.736469-34423-145807226562770/AnsiballZ_command.py && sleep 0' 33932 1726882888.94357: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882888.94383: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882888.94399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882888.94417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882888.94469: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882888.94487: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882888.94502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882888.94520: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882888.94533: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882888.94545: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882888.94565: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882888.94584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882888.94603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882888.94617: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882888.94629: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882888.94643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882888.94725: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882888.94743: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882888.94757: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882888.94919: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882889.10285: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "lsr101", "managed", "true"], "start": "2024-09-20 21:41:29.080931", "end": "2024-09-20 21:41:29.099307", "delta": "0:00:00.018376", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set lsr101 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 33932 1726882889.11430: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 33932 1726882889.11436: stderr chunk (state=3): >>><<< 33932 1726882889.11439: stdout chunk (state=3): >>><<< 33932 1726882889.11602: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "lsr101", "managed", "true"], "start": "2024-09-20 21:41:29.080931", "end": "2024-09-20 21:41:29.099307", "delta": "0:00:00.018376", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set lsr101 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
33932 1726882889.11641: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set lsr101 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882888.736469-34423-145807226562770/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
33932 1726882889.11646: _low_level_execute_command(): starting
33932 1726882889.11653: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882888.736469-34423-145807226562770/ > /dev/null 2>&1 && sleep 0'
33932 1726882889.13209: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
33932 1726882889.13217: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
33932 1726882889.13228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
33932 1726882889.13241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
33932 1726882889.13337: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
33932 1726882889.13423: stderr chunk (state=3): >>>debug2: match not found <<<
33932 1726882889.13434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
33932 1726882889.13447: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
33932 1726882889.13455: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<<
33932 1726882889.13462: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
33932 1726882889.13480: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
33932 1726882889.13494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
33932 1726882889.13502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
33932 1726882889.13508: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
33932 1726882889.13517: stderr chunk (state=3): >>>debug2: match found <<<
33932 1726882889.13525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
33932 1726882889.13698: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
33932 1726882889.13715: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
33932 1726882889.13727: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
33932 1726882889.13848: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
33932 1726882889.15759: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
33932 1726882889.15766: stdout chunk (state=3): >>><<<
33932 1726882889.15784: stderr chunk (state=3): >>><<<
33932 1726882889.15796: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.44.90 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
33932 1726882889.15803: handler run complete
33932 1726882889.15827: Evaluated conditional (False): False
33932 1726882889.15837: attempt loop complete, returning result
33932 1726882889.15840: _execute() done
33932 1726882889.15842: dumping result to json
33932 1726882889.15848: done dumping result, returning
33932 1726882889.15856: done running TaskExecutor() for managed_node1/TASK: Set up veth as managed by NetworkManager [0e448fcc-3ce9-615b-5c48-000000000220]
33932 1726882889.15861: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000220
33932 1726882889.15968: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000220
33932 1726882889.15972: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "changed": false,
    "cmd": [
        "nmcli",
        "d",
        "set",
        "lsr101",
        "managed",
        "true"
    ],
    "delta": "0:00:00.018376",
    "end": "2024-09-20 21:41:29.099307",
    "rc": 0,
    "start": "2024-09-20 21:41:29.080931"
}
33932 1726882889.16066: no more pending results, returning what we have
33932 1726882889.16069: results queue empty
33932 1726882889.16071: checking for any_errors_fatal
33932 1726882889.16084: done checking for any_errors_fatal
33932 1726882889.16084: checking for max_fail_percentage
33932 1726882889.16086: done checking for max_fail_percentage
33932 1726882889.16086: checking to see if all hosts have failed and the running result is not ok
33932 1726882889.16087: done checking to see if all hosts have failed
33932 1726882889.16088: getting the remaining hosts for this loop
33932 1726882889.16090: done getting the remaining hosts for this loop
33932 1726882889.16093: getting the next task for host managed_node1
33932 1726882889.16098: done getting next task for host managed_node1
33932 1726882889.16100: ^ task is: TASK: Delete veth interface {{ interface }}
33932 1726882889.16103: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33932 1726882889.16106: getting variables
33932 1726882889.16108: in VariableManager get_vars()
33932 1726882889.16144: Calling all_inventory to load vars for managed_node1
33932 1726882889.16146: Calling groups_inventory to load vars for managed_node1
33932 1726882889.16148: Calling all_plugins_inventory to load vars for managed_node1
33932 1726882889.16158: Calling all_plugins_play to load vars for managed_node1
33932 1726882889.16160: Calling groups_plugins_inventory to load vars for managed_node1
33932 1726882889.16162: Calling groups_plugins_play to load vars for managed_node1
33932 1726882889.16335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33932 1726882889.16579: done with get_vars()
33932 1726882889.16593: done getting variables
33932 1726882889.16649: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
33932 1726882889.16989: variable 'interface' from source: play vars

TASK [Delete veth interface lsr101] ********************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43
Friday 20 September 2024 21:41:29 -0400 (0:00:00.504) 0:00:09.637 ******
33932 1726882889.17239: entering _queue_task() for managed_node1/command
33932 1726882889.17914: worker is 1 (out of 1 available)
33932 1726882889.17926: exiting _queue_task() for managed_node1/command
33932 1726882889.17937: done queuing things up, now waiting for results queue to drain
33932 1726882889.17939: waiting for pending results...
33932 1726882889.18562: running TaskExecutor() for managed_node1/TASK: Delete veth interface lsr101
33932 1726882889.18781: in run() - task 0e448fcc-3ce9-615b-5c48-000000000221
33932 1726882889.18805: variable 'ansible_search_path' from source: unknown
33932 1726882889.18919: variable 'ansible_search_path' from source: unknown
33932 1726882889.18956: calling self._execute()
33932 1726882889.19046: variable 'ansible_host' from source: host vars for 'managed_node1'
33932 1726882889.19138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33932 1726882889.19151: variable 'omit' from source: magic vars
33932 1726882889.19742: variable 'ansible_distribution_major_version' from source: facts
33932 1726882889.19902: Evaluated conditional (ansible_distribution_major_version != '6'): True
33932 1726882889.20275: variable 'type' from source: play vars
33932 1726882889.20286: variable 'state' from source: include params
33932 1726882889.20294: variable 'interface' from source: play vars
33932 1726882889.20301: variable 'current_interfaces' from source: set_fact
33932 1726882889.20329: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False
33932 1726882889.20336: when evaluation is False, skipping this task
33932 1726882889.20341: _execute() done
33932 1726882889.20376: dumping result to json
33932 1726882889.20392: done dumping result, returning
33932 1726882889.20422: done running TaskExecutor() for managed_node1/TASK: Delete veth interface lsr101 [0e448fcc-3ce9-615b-5c48-000000000221]
33932 1726882889.20511: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000221
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces",
    "skip_reason": "Conditional result was False"
}
33932 1726882889.20831: no more pending results, returning what we have
33932 1726882889.20851: results queue empty
33932 1726882889.20869: checking for any_errors_fatal
33932 1726882889.20907: done checking for any_errors_fatal
33932 1726882889.20908: checking for max_fail_percentage
33932 1726882889.20910: done checking for max_fail_percentage
33932 1726882889.20911: checking to see if all hosts have failed and the running result is not ok
33932 1726882889.20935: done checking to see if all hosts have failed
33932 1726882889.20937: getting the remaining hosts for this loop
33932 1726882889.20939: done getting the remaining hosts for this loop
33932 1726882889.20958: getting the next task for host managed_node1
33932 1726882889.20981: done getting next task for host managed_node1
33932 1726882889.20985: ^ task is: TASK: Create dummy interface {{ interface }}
33932 1726882889.20988: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33932 1726882889.21029: getting variables
33932 1726882889.21032: in VariableManager get_vars()
33932 1726882889.21162: Calling all_inventory to load vars for managed_node1
33932 1726882889.21167: Calling groups_inventory to load vars for managed_node1
33932 1726882889.21169: Calling all_plugins_inventory to load vars for managed_node1
33932 1726882889.21181: Calling all_plugins_play to load vars for managed_node1
33932 1726882889.21184: Calling groups_plugins_inventory to load vars for managed_node1
33932 1726882889.21187: Calling groups_plugins_play to load vars for managed_node1
33932 1726882889.21553: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000221
33932 1726882889.21557: WORKER PROCESS EXITING
33932 1726882889.21576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33932 1726882889.22029: done with get_vars()
33932 1726882889.22040: done getting variables
33932 1726882889.22305: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
33932 1726882889.22745: variable 'interface' from source: play vars

TASK [Create dummy interface lsr101] *******************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49
Friday 20 September 2024 21:41:29 -0400 (0:00:00.058) 0:00:09.695 ******
33932 1726882889.22830: entering _queue_task() for managed_node1/command
33932 1726882889.23692: worker is 1 (out of 1 available)
33932 1726882889.23751: exiting _queue_task() for managed_node1/command
33932 1726882889.23771: done queuing things up, now waiting for results queue to drain
33932 1726882889.23775: waiting for pending results...
33932 1726882889.24827: running TaskExecutor() for managed_node1/TASK: Create dummy interface lsr101
33932 1726882889.25046: in run() - task 0e448fcc-3ce9-615b-5c48-000000000222
33932 1726882889.25187: variable 'ansible_search_path' from source: unknown
33932 1726882889.25196: variable 'ansible_search_path' from source: unknown
33932 1726882889.25236: calling self._execute()
33932 1726882889.25332: variable 'ansible_host' from source: host vars for 'managed_node1'
33932 1726882889.25401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33932 1726882889.25486: variable 'omit' from source: magic vars
33932 1726882889.26131: variable 'ansible_distribution_major_version' from source: facts
33932 1726882889.26273: Evaluated conditional (ansible_distribution_major_version != '6'): True
33932 1726882889.26695: variable 'type' from source: play vars
33932 1726882889.26707: variable 'state' from source: include params
33932 1726882889.26715: variable 'interface' from source: play vars
33932 1726882889.26722: variable 'current_interfaces' from source: set_fact
33932 1726882889.26732: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False
33932 1726882889.26738: when evaluation is False, skipping this task
33932 1726882889.26743: _execute() done
33932 1726882889.26749: dumping result to json
33932 1726882889.26755: done dumping result, returning
33932 1726882889.26765: done running TaskExecutor() for managed_node1/TASK: Create dummy interface lsr101 [0e448fcc-3ce9-615b-5c48-000000000222]
33932 1726882889.26776: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000222
33932 1726882889.26920: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000222
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces",
    "skip_reason": "Conditional result was False"
}
33932 1726882889.26995: no more pending results, returning what we have
33932 1726882889.27000: results queue empty
33932 1726882889.27003: checking for any_errors_fatal
33932 1726882889.27015: done checking for any_errors_fatal
33932 1726882889.27019: checking for max_fail_percentage
33932 1726882889.27022: done checking for max_fail_percentage
33932 1726882889.27022: checking to see if all hosts have failed and the running result is not ok
33932 1726882889.27023: done checking to see if all hosts have failed
33932 1726882889.27024: getting the remaining hosts for this loop
33932 1726882889.27026: done getting the remaining hosts for this loop
33932 1726882889.27030: getting the next task for host managed_node1
33932 1726882889.27036: done getting next task for host managed_node1
33932 1726882889.27038: ^ task is: TASK: Delete dummy interface {{ interface }}
33932 1726882889.27041: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33932 1726882889.27047: getting variables
33932 1726882889.27049: in VariableManager get_vars()
33932 1726882889.27100: Calling all_inventory to load vars for managed_node1
33932 1726882889.27106: Calling groups_inventory to load vars for managed_node1
33932 1726882889.27113: Calling all_plugins_inventory to load vars for managed_node1
33932 1726882889.27126: Calling all_plugins_play to load vars for managed_node1
33932 1726882889.27129: Calling groups_plugins_inventory to load vars for managed_node1
33932 1726882889.27132: Calling groups_plugins_play to load vars for managed_node1
33932 1726882889.27450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33932 1726882889.28245: done with get_vars()
33932 1726882889.28255: done getting variables
33932 1726882889.28446: WORKER PROCESS EXITING
33932 1726882889.28515: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
33932 1726882889.29035: variable 'interface' from source: play vars

TASK [Delete dummy interface lsr101] *******************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54
Friday 20 September 2024 21:41:29 -0400 (0:00:00.062) 0:00:09.758 ******
33932 1726882889.29112: entering _queue_task() for managed_node1/command
33932 1726882889.29602: worker is 1 (out of 1 available)
33932 1726882889.29615: exiting _queue_task() for managed_node1/command
33932 1726882889.29625: done queuing things up, now waiting for results queue to drain
33932 1726882889.29627: waiting for pending results...
33932 1726882889.30817: running TaskExecutor() for managed_node1/TASK: Delete dummy interface lsr101
33932 1726882889.30896: in run() - task 0e448fcc-3ce9-615b-5c48-000000000223
33932 1726882889.31140: variable 'ansible_search_path' from source: unknown
33932 1726882889.31144: variable 'ansible_search_path' from source: unknown
33932 1726882889.31178: calling self._execute()
33932 1726882889.31379: variable 'ansible_host' from source: host vars for 'managed_node1'
33932 1726882889.31383: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33932 1726882889.31393: variable 'omit' from source: magic vars
33932 1726882889.32365: variable 'ansible_distribution_major_version' from source: facts
33932 1726882889.32378: Evaluated conditional (ansible_distribution_major_version != '6'): True
33932 1726882889.33120: variable 'type' from source: play vars
33932 1726882889.33202: variable 'state' from source: include params
33932 1726882889.33242: variable 'interface' from source: play vars
33932 1726882889.33246: variable 'current_interfaces' from source: set_fact
33932 1726882889.33251: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False
33932 1726882889.33254: when evaluation is False, skipping this task
33932 1726882889.33256: _execute() done
33932 1726882889.33259: dumping result to json
33932 1726882889.33261: done dumping result, returning
33932 1726882889.33272: done running TaskExecutor() for managed_node1/TASK: Delete dummy interface lsr101 [0e448fcc-3ce9-615b-5c48-000000000223]
33932 1726882889.33275: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000223
33932 1726882889.33376: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000223
33932 1726882889.33381: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces",
    "skip_reason": "Conditional result was False"
}
33932 1726882889.33452: no more pending results, returning what we have
33932 1726882889.33460: results queue empty
33932 1726882889.33461: checking for any_errors_fatal
33932 1726882889.33485: done checking for any_errors_fatal
33932 1726882889.33487: checking for max_fail_percentage
33932 1726882889.33488: done checking for max_fail_percentage
33932 1726882889.33489: checking to see if all hosts have failed and the running result is not ok
33932 1726882889.33490: done checking to see if all hosts have failed
33932 1726882889.33491: getting the remaining hosts for this loop
33932 1726882889.33493: done getting the remaining hosts for this loop
33932 1726882889.33501: getting the next task for host managed_node1
33932 1726882889.33510: done getting next task for host managed_node1
33932 1726882889.33515: ^ task is: TASK: Create tap interface {{ interface }}
33932 1726882889.33519: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33932 1726882889.33523: getting variables
33932 1726882889.33525: in VariableManager get_vars()
33932 1726882889.33576: Calling all_inventory to load vars for managed_node1
33932 1726882889.33579: Calling groups_inventory to load vars for managed_node1
33932 1726882889.33581: Calling all_plugins_inventory to load vars for managed_node1
33932 1726882889.33593: Calling all_plugins_play to load vars for managed_node1
33932 1726882889.33596: Calling groups_plugins_inventory to load vars for managed_node1
33932 1726882889.33598: Calling groups_plugins_play to load vars for managed_node1
33932 1726882889.33833: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33932 1726882889.34089: done with get_vars()
33932 1726882889.34184: done getting variables
33932 1726882889.34243: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
33932 1726882889.34399: variable 'interface' from source: play vars

TASK [Create tap interface lsr101] *********************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60
Friday 20 September 2024 21:41:29 -0400 (0:00:00.054) 0:00:09.812 ******
33932 1726882889.34581: entering _queue_task() for managed_node1/command
33932 1726882889.35353: worker is 1 (out of 1 available)
33932 1726882889.35402: exiting _queue_task() for managed_node1/command
33932 1726882889.35436: done queuing things up, now waiting for results queue to drain
33932 1726882889.35438: waiting for pending results...
33932 1726882889.36730: running TaskExecutor() for managed_node1/TASK: Create tap interface lsr101
33932 1726882889.36908: in run() - task 0e448fcc-3ce9-615b-5c48-000000000224
33932 1726882889.37048: variable 'ansible_search_path' from source: unknown
33932 1726882889.37059: variable 'ansible_search_path' from source: unknown
33932 1726882889.37105: calling self._execute()
33932 1726882889.37202: variable 'ansible_host' from source: host vars for 'managed_node1'
33932 1726882889.37372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33932 1726882889.37387: variable 'omit' from source: magic vars
33932 1726882889.37976: variable 'ansible_distribution_major_version' from source: facts
33932 1726882889.38086: Evaluated conditional (ansible_distribution_major_version != '6'): True
33932 1726882889.38690: variable 'type' from source: play vars
33932 1726882889.38700: variable 'state' from source: include params
33932 1726882889.38708: variable 'interface' from source: play vars
33932 1726882889.38715: variable 'current_interfaces' from source: set_fact
33932 1726882889.38785: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False
33932 1726882889.38796: when evaluation is False, skipping this task
33932 1726882889.38805: _execute() done
33932 1726882889.38812: dumping result to json
33932 1726882889.38819: done dumping result, returning
33932 1726882889.38827: done running TaskExecutor() for managed_node1/TASK: Create tap interface lsr101 [0e448fcc-3ce9-615b-5c48-000000000224]
33932 1726882889.38836: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000224
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces",
    "skip_reason": "Conditional result was False"
}
33932 1726882889.38976: no more pending results, returning what we have
33932 1726882889.38981: results queue empty
33932 1726882889.38982: checking for any_errors_fatal
33932 1726882889.38989: done checking for any_errors_fatal
33932 1726882889.38990: checking for max_fail_percentage
33932 1726882889.38991: done checking for max_fail_percentage
33932 1726882889.38992: checking to see if all hosts have failed and the running result is not ok
33932 1726882889.38993: done checking to see if all hosts have failed
33932 1726882889.38994: getting the remaining hosts for this loop
33932 1726882889.38995: done getting the remaining hosts for this loop
33932 1726882889.38999: getting the next task for host managed_node1
33932 1726882889.39005: done getting next task for host managed_node1
33932 1726882889.39008: ^ task is: TASK: Delete tap interface {{ interface }}
33932 1726882889.39011: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33932 1726882889.39014: getting variables
33932 1726882889.39016: in VariableManager get_vars()
33932 1726882889.39058: Calling all_inventory to load vars for managed_node1
33932 1726882889.39061: Calling groups_inventory to load vars for managed_node1
33932 1726882889.39065: Calling all_plugins_inventory to load vars for managed_node1
33932 1726882889.39080: Calling all_plugins_play to load vars for managed_node1
33932 1726882889.39083: Calling groups_plugins_inventory to load vars for managed_node1
33932 1726882889.39086: Calling groups_plugins_play to load vars for managed_node1
33932 1726882889.39332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33932 1726882889.39550: done with get_vars()
33932 1726882889.39560: done getting variables
33932 1726882889.39826: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000224
33932 1726882889.39829: WORKER PROCESS EXITING
33932 1726882889.39870: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
33932 1726882889.40191: variable 'interface' from source: play vars

TASK [Delete tap interface lsr101] *********************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65
Friday 20 September 2024 21:41:29 -0400 (0:00:00.056) 0:00:09.869 ******
33932 1726882889.40217: entering _queue_task() for managed_node1/command
33932 1726882889.41650: worker is 1 (out of 1 available)
33932 1726882889.41665: exiting _queue_task() for managed_node1/command
33932 1726882889.41681: done queuing things up, now waiting for results queue to drain
33932 1726882889.41683: waiting for pending results...
33932 1726882889.42361: running TaskExecutor() for managed_node1/TASK: Delete tap interface lsr101
33932 1726882889.42585: in run() - task 0e448fcc-3ce9-615b-5c48-000000000225
33932 1726882889.42686: variable 'ansible_search_path' from source: unknown
33932 1726882889.42694: variable 'ansible_search_path' from source: unknown
33932 1726882889.42758: calling self._execute()
33932 1726882889.42905: variable 'ansible_host' from source: host vars for 'managed_node1'
33932 1726882889.42959: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33932 1726882889.43051: variable 'omit' from source: magic vars
33932 1726882889.43687: variable 'ansible_distribution_major_version' from source: facts
33932 1726882889.43784: Evaluated conditional (ansible_distribution_major_version != '6'): True
33932 1726882889.44239: variable 'type' from source: play vars
33932 1726882889.44255: variable 'state' from source: include params
33932 1726882889.44348: variable 'interface' from source: play vars
33932 1726882889.44362: variable 'current_interfaces' from source: set_fact
33932 1726882889.44378: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False
33932 1726882889.44385: when evaluation is False, skipping this task
33932 1726882889.44391: _execute() done
33932 1726882889.44398: dumping result to json
33932 1726882889.44405: done dumping result, returning
33932 1726882889.44414: done running TaskExecutor() for managed_node1/TASK: Delete tap interface lsr101 [0e448fcc-3ce9-615b-5c48-000000000225]
33932 1726882889.44423: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000225
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces",
    "skip_reason": "Conditional result was False"
}
33932 1726882889.44564: no more pending results, returning what we have
33932 1726882889.44571: results queue empty
33932 1726882889.44572: checking for any_errors_fatal
33932 1726882889.44579: done checking for any_errors_fatal
33932 1726882889.44580: checking for max_fail_percentage
33932 1726882889.44582: done checking for max_fail_percentage
33932 1726882889.44583: checking to see if all hosts have failed and the running result is not ok
33932 1726882889.44583: done checking to see if all hosts have failed
33932 1726882889.44584: getting the remaining hosts for this loop
33932 1726882889.44586: done getting the remaining hosts for this loop
33932 1726882889.44590: getting the next task for host managed_node1
33932 1726882889.44598: done getting next task for host managed_node1
33932 1726882889.44602: ^ task is: TASK: Include the task 'assert_device_present.yml'
33932 1726882889.44604: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33932 1726882889.44608: getting variables
33932 1726882889.44610: in VariableManager get_vars()
33932 1726882889.44649: Calling all_inventory to load vars for managed_node1
33932 1726882889.44652: Calling groups_inventory to load vars for managed_node1
33932 1726882889.44654: Calling all_plugins_inventory to load vars for managed_node1
33932 1726882889.44673: Calling all_plugins_play to load vars for managed_node1
33932 1726882889.44678: Calling groups_plugins_inventory to load vars for managed_node1
33932 1726882889.44683: Calling groups_plugins_play to load vars for managed_node1
33932 1726882889.44845: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000225
33932 1726882889.44849: WORKER PROCESS EXITING
33932 1726882889.44870: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33932 1726882889.45072: done with get_vars()
33932 1726882889.45084: done getting variables

TASK [Include the task 'assert_device_present.yml'] ****************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:16
Friday 20 September 2024 21:41:29 -0400 (0:00:00.049) 0:00:09.921 ******
33932 1726882889.45365: entering _queue_task() for managed_node1/include_tasks
33932 1726882889.46058: worker is 1 (out of 1 available)
33932 1726882889.46075: exiting _queue_task() for managed_node1/include_tasks
33932 1726882889.46088: done queuing things up, now waiting for results queue to drain
33932 1726882889.46090: waiting for pending results...
33932 1726882889.46460: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_present.yml' 33932 1726882889.46660: in run() - task 0e448fcc-3ce9-615b-5c48-00000000000d 33932 1726882889.46684: variable 'ansible_search_path' from source: unknown 33932 1726882889.46834: calling self._execute() 33932 1726882889.46918: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882889.47025: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882889.47040: variable 'omit' from source: magic vars 33932 1726882889.47720: variable 'ansible_distribution_major_version' from source: facts 33932 1726882889.47736: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882889.47746: _execute() done 33932 1726882889.47754: dumping result to json 33932 1726882889.47761: done dumping result, returning 33932 1726882889.47779: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_present.yml' [0e448fcc-3ce9-615b-5c48-00000000000d] 33932 1726882889.47793: sending task result for task 0e448fcc-3ce9-615b-5c48-00000000000d 33932 1726882889.47917: no more pending results, returning what we have 33932 1726882889.47922: in VariableManager get_vars() 33932 1726882889.47971: Calling all_inventory to load vars for managed_node1 33932 1726882889.47975: Calling groups_inventory to load vars for managed_node1 33932 1726882889.47978: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882889.47993: Calling all_plugins_play to load vars for managed_node1 33932 1726882889.47996: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882889.47999: Calling groups_plugins_play to load vars for managed_node1 33932 1726882889.48221: done sending task result for task 0e448fcc-3ce9-615b-5c48-00000000000d 33932 1726882889.48225: WORKER PROCESS EXITING 33932 1726882889.48239: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882889.48430: done with get_vars() 33932 1726882889.48440: variable 'ansible_search_path' from source: unknown 33932 1726882889.48452: we have included files to process 33932 1726882889.48453: generating all_blocks data 33932 1726882889.48455: done generating all_blocks data 33932 1726882889.48459: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 33932 1726882889.48460: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 33932 1726882889.48462: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 33932 1726882889.48808: in VariableManager get_vars() 33932 1726882889.48828: done with get_vars() 33932 1726882889.49891: done processing included file 33932 1726882889.49893: iterating over new_blocks loaded from include file 33932 1726882889.49895: in VariableManager get_vars() 33932 1726882889.50212: done with get_vars() 33932 1726882889.50215: filtering new block on tags 33932 1726882889.50236: done filtering new block on tags 33932 1726882889.50238: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node1 33932 1726882889.50243: extending task lists for all hosts with included blocks 33932 1726882889.56756: done extending task lists 33932 1726882889.56758: done processing included files 33932 1726882889.56760: results queue empty 33932 1726882889.56760: checking for any_errors_fatal 33932 1726882889.56766: done checking for any_errors_fatal 33932 1726882889.56766: checking for max_fail_percentage 33932 1726882889.56768: done 
checking for max_fail_percentage 33932 1726882889.56768: checking to see if all hosts have failed and the running result is not ok 33932 1726882889.56769: done checking to see if all hosts have failed 33932 1726882889.56770: getting the remaining hosts for this loop 33932 1726882889.56771: done getting the remaining hosts for this loop 33932 1726882889.56774: getting the next task for host managed_node1 33932 1726882889.56778: done getting next task for host managed_node1 33932 1726882889.56780: ^ task is: TASK: Include the task 'get_interface_stat.yml' 33932 1726882889.56782: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882889.56785: getting variables 33932 1726882889.56786: in VariableManager get_vars() 33932 1726882889.56803: Calling all_inventory to load vars for managed_node1 33932 1726882889.56806: Calling groups_inventory to load vars for managed_node1 33932 1726882889.56808: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882889.56814: Calling all_plugins_play to load vars for managed_node1 33932 1726882889.56816: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882889.56819: Calling groups_plugins_play to load vars for managed_node1 33932 1726882889.56969: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882889.57778: done with get_vars() 33932 1726882889.57789: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:41:29 -0400 (0:00:00.125) 0:00:10.046 ****** 33932 1726882889.57870: entering _queue_task() for managed_node1/include_tasks 33932 1726882889.58663: worker is 1 (out of 1 available) 33932 1726882889.58681: exiting _queue_task() for managed_node1/include_tasks 33932 1726882889.58696: done queuing things up, now waiting for results queue to drain 33932 1726882889.58698: waiting for pending results... 
33932 1726882889.59583: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 33932 1726882889.59807: in run() - task 0e448fcc-3ce9-615b-5c48-00000000038b 33932 1726882889.59959: variable 'ansible_search_path' from source: unknown 33932 1726882889.59976: variable 'ansible_search_path' from source: unknown 33932 1726882889.60020: calling self._execute() 33932 1726882889.60227: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882889.60238: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882889.60251: variable 'omit' from source: magic vars 33932 1726882889.60994: variable 'ansible_distribution_major_version' from source: facts 33932 1726882889.61161: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882889.61180: _execute() done 33932 1726882889.61190: dumping result to json 33932 1726882889.61198: done dumping result, returning 33932 1726882889.61206: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-615b-5c48-00000000038b] 33932 1726882889.61214: sending task result for task 0e448fcc-3ce9-615b-5c48-00000000038b 33932 1726882889.61343: no more pending results, returning what we have 33932 1726882889.61349: in VariableManager get_vars() 33932 1726882889.61405: Calling all_inventory to load vars for managed_node1 33932 1726882889.61409: Calling groups_inventory to load vars for managed_node1 33932 1726882889.61411: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882889.61425: Calling all_plugins_play to load vars for managed_node1 33932 1726882889.61428: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882889.61431: Calling groups_plugins_play to load vars for managed_node1 33932 1726882889.61633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882889.61847: done 
with get_vars() 33932 1726882889.61854: variable 'ansible_search_path' from source: unknown 33932 1726882889.61855: variable 'ansible_search_path' from source: unknown 33932 1726882889.61907: we have included files to process 33932 1726882889.61908: generating all_blocks data 33932 1726882889.61911: done generating all_blocks data 33932 1726882889.61912: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 33932 1726882889.61913: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 33932 1726882889.61915: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 33932 1726882889.62459: done sending task result for task 0e448fcc-3ce9-615b-5c48-00000000038b 33932 1726882889.62462: WORKER PROCESS EXITING 33932 1726882889.62708: done processing included file 33932 1726882889.62710: iterating over new_blocks loaded from include file 33932 1726882889.62711: in VariableManager get_vars() 33932 1726882889.62729: done with get_vars() 33932 1726882889.62730: filtering new block on tags 33932 1726882889.62743: done filtering new block on tags 33932 1726882889.62745: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 33932 1726882889.62749: extending task lists for all hosts with included blocks 33932 1726882889.63026: done extending task lists 33932 1726882889.63027: done processing included files 33932 1726882889.63029: results queue empty 33932 1726882889.63029: checking for any_errors_fatal 33932 1726882889.63033: done checking for any_errors_fatal 33932 1726882889.63033: checking for max_fail_percentage 33932 1726882889.63034: done checking for max_fail_percentage 
33932 1726882889.63035: checking to see if all hosts have failed and the running result is not ok 33932 1726882889.63036: done checking to see if all hosts have failed 33932 1726882889.63036: getting the remaining hosts for this loop 33932 1726882889.63038: done getting the remaining hosts for this loop 33932 1726882889.63041: getting the next task for host managed_node1 33932 1726882889.63044: done getting next task for host managed_node1 33932 1726882889.63046: ^ task is: TASK: Get stat for interface {{ interface }} 33932 1726882889.63049: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882889.63051: getting variables 33932 1726882889.63052: in VariableManager get_vars() 33932 1726882889.63071: Calling all_inventory to load vars for managed_node1 33932 1726882889.63073: Calling groups_inventory to load vars for managed_node1 33932 1726882889.63075: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882889.63081: Calling all_plugins_play to load vars for managed_node1 33932 1726882889.63083: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882889.63085: Calling groups_plugins_play to load vars for managed_node1 33932 1726882889.63513: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882889.63929: done with get_vars() 33932 1726882889.63938: done getting variables 33932 1726882889.64300: variable 'interface' from source: play vars TASK [Get stat for interface lsr101] ******************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:41:29 -0400 (0:00:00.065) 0:00:10.112 ****** 33932 1726882889.64445: entering _queue_task() for managed_node1/stat 33932 1726882889.65589: worker is 1 (out of 1 available) 33932 1726882889.65601: exiting _queue_task() for managed_node1/stat 33932 1726882889.65611: done queuing things up, now waiting for results queue to drain 33932 1726882889.65612: waiting for pending results... 
33932 1726882889.66358: running TaskExecutor() for managed_node1/TASK: Get stat for interface lsr101 33932 1726882889.66585: in run() - task 0e448fcc-3ce9-615b-5c48-0000000004a4 33932 1726882889.66605: variable 'ansible_search_path' from source: unknown 33932 1726882889.66614: variable 'ansible_search_path' from source: unknown 33932 1726882889.66652: calling self._execute() 33932 1726882889.66845: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882889.66856: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882889.66891: variable 'omit' from source: magic vars 33932 1726882889.67566: variable 'ansible_distribution_major_version' from source: facts 33932 1726882889.67655: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882889.67667: variable 'omit' from source: magic vars 33932 1726882889.67714: variable 'omit' from source: magic vars 33932 1726882889.67840: variable 'interface' from source: play vars 33932 1726882889.67986: variable 'omit' from source: magic vars 33932 1726882889.68028: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882889.68103: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882889.68200: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882889.68218: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882889.68233: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882889.68265: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882889.68404: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882889.68413: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882889.68520: Set connection var ansible_shell_executable to /bin/sh 33932 1726882889.68626: Set connection var ansible_timeout to 10 33932 1726882889.68636: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882889.68645: Set connection var ansible_pipelining to False 33932 1726882889.68652: Set connection var ansible_connection to ssh 33932 1726882889.68658: Set connection var ansible_shell_type to sh 33932 1726882889.68690: variable 'ansible_shell_executable' from source: unknown 33932 1726882889.68698: variable 'ansible_connection' from source: unknown 33932 1726882889.68705: variable 'ansible_module_compression' from source: unknown 33932 1726882889.68712: variable 'ansible_shell_type' from source: unknown 33932 1726882889.68723: variable 'ansible_shell_executable' from source: unknown 33932 1726882889.68731: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882889.68838: variable 'ansible_pipelining' from source: unknown 33932 1726882889.68846: variable 'ansible_timeout' from source: unknown 33932 1726882889.68854: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882889.69280: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 33932 1726882889.69296: variable 'omit' from source: magic vars 33932 1726882889.69306: starting attempt loop 33932 1726882889.69313: running the handler 33932 1726882889.69330: _low_level_execute_command(): starting 33932 1726882889.69343: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 33932 1726882889.71070: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882889.71130: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 33932 1726882889.71146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882889.71167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882889.71214: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882889.71346: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882889.71362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882889.71386: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882889.71399: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882889.71410: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882889.71422: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882889.71437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882889.71459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882889.71476: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882889.71491: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882889.71508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882889.71589: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882889.71614: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882889.71683: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882889.71818: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 33932 1726882889.73503: stdout chunk (state=3): >>>/root <<< 33932 1726882889.73695: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882889.73700: stdout chunk (state=3): >>><<< 33932 1726882889.73703: stderr chunk (state=3): >>><<< 33932 1726882889.73771: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882889.73776: _low_level_execute_command(): starting 33932 1726882889.73780: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882889.7372468-34463-280312558358073 `" && echo ansible-tmp-1726882889.7372468-34463-280312558358073="` echo /root/.ansible/tmp/ansible-tmp-1726882889.7372468-34463-280312558358073 `" ) && sleep 0' 33932 
1726882889.74606: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882889.74610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882889.74630: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882889.74653: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882889.74656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 33932 1726882889.74660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882889.74733: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882889.74738: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882889.74741: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882889.74847: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882889.76756: stdout chunk (state=3): >>>ansible-tmp-1726882889.7372468-34463-280312558358073=/root/.ansible/tmp/ansible-tmp-1726882889.7372468-34463-280312558358073 <<< 33932 1726882889.76952: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882889.76955: stdout chunk (state=3): >>><<< 33932 1726882889.76957: stderr 
chunk (state=3): >>><<< 33932 1726882889.77172: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882889.7372468-34463-280312558358073=/root/.ansible/tmp/ansible-tmp-1726882889.7372468-34463-280312558358073 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882889.77176: variable 'ansible_module_compression' from source: unknown 33932 1726882889.77179: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-33932njcnmxb6/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 33932 1726882889.77181: variable 'ansible_facts' from source: unknown 33932 1726882889.77237: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882889.7372468-34463-280312558358073/AnsiballZ_stat.py 33932 1726882889.77861: Sending initial data 33932 1726882889.77867: Sent initial data (153 bytes) 33932 1726882889.80116: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882889.80119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882889.80176: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 33932 1726882889.80180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882889.80883: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882889.80890: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882889.80903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882889.81368: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882889.81372: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882889.81374: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882889.81376: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882889.83041: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 33932 1726882889.83132: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 33932 1726882889.83229: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-33932njcnmxb6/tmpvwud2gpu /root/.ansible/tmp/ansible-tmp-1726882889.7372468-34463-280312558358073/AnsiballZ_stat.py <<< 33932 1726882889.83317: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 33932 1726882889.84848: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882889.85052: stderr chunk (state=3): >>><<< 33932 1726882889.85055: stdout chunk (state=3): >>><<< 33932 1726882889.85058: done transferring module to remote 33932 1726882889.85060: _low_level_execute_command(): starting 33932 1726882889.85062: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882889.7372468-34463-280312558358073/ /root/.ansible/tmp/ansible-tmp-1726882889.7372468-34463-280312558358073/AnsiballZ_stat.py && sleep 0' 33932 1726882889.87133: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882889.87137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882889.87174: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 33932 1726882889.87179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882889.87185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882889.87356: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882889.87384: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882889.87531: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882889.89450: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882889.89453: stdout chunk (state=3): >>><<< 33932 1726882889.89461: stderr chunk (state=3): >>><<< 33932 1726882889.89483: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882889.89486: _low_level_execute_command(): starting 33932 1726882889.89489: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882889.7372468-34463-280312558358073/AnsiballZ_stat.py && sleep 0' 33932 1726882889.91427: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882889.91552: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882889.91561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882889.91580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882889.91617: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882889.91658: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882889.91669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882889.91687: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882889.91770: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882889.91780: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882889.91881: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882889.91892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882889.91903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
<<< 33932 1726882889.91911: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882889.91918: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882889.91928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882889.92008: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882889.92099: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882889.92186: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882889.92325: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882890.05798: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr101", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 30723, "dev": 21, "nlink": 1, "atime": 1726882887.7951593, "mtime": 1726882887.7951593, "ctime": 1726882887.7951593, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr101", "lnk_target": "../../devices/virtual/net/lsr101", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr101", "follow": false, "checksum_algorithm": "sha1"}}} <<< 33932 1726882890.06899: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 33932 1726882890.06903: stdout chunk (state=3): >>><<< 33932 1726882890.06906: stderr chunk (state=3): >>><<< 33932 1726882890.07068: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr101", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 30723, "dev": 21, "nlink": 1, "atime": 1726882887.7951593, "mtime": 1726882887.7951593, "ctime": 1726882887.7951593, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr101", "lnk_target": "../../devices/virtual/net/lsr101", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr101", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 33932 1726882890.07072: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/lsr101', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882889.7372468-34463-280312558358073/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 33932 1726882890.07107: _low_level_execute_command(): starting 33932 1726882890.07112: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882889.7372468-34463-280312558358073/ > /dev/null 2>&1 && sleep 0' 33932 1726882890.08124: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882890.08146: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882890.08162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882890.08188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882890.08229: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882890.08308: stderr 
chunk (state=3): >>>debug2: match not found <<< 33932 1726882890.08323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882890.08343: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882890.08356: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882890.08379: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882890.08393: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882890.08415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882890.08431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882890.08497: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882890.08509: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882890.08526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882890.08640: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882890.08688: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882890.08716: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882890.08851: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882890.10775: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882890.10778: stdout chunk (state=3): >>><<< 33932 1726882890.10780: stderr chunk (state=3): >>><<< 33932 1726882890.11172: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882890.11175: handler run complete 33932 1726882890.11178: attempt loop complete, returning result 33932 1726882890.11180: _execute() done 33932 1726882890.11182: dumping result to json 33932 1726882890.11184: done dumping result, returning 33932 1726882890.11186: done running TaskExecutor() for managed_node1/TASK: Get stat for interface lsr101 [0e448fcc-3ce9-615b-5c48-0000000004a4] 33932 1726882890.11188: sending task result for task 0e448fcc-3ce9-615b-5c48-0000000004a4 33932 1726882890.11265: done sending task result for task 0e448fcc-3ce9-615b-5c48-0000000004a4 33932 1726882890.11269: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "atime": 1726882887.7951593, "block_size": 4096, "blocks": 0, "ctime": 1726882887.7951593, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 30723, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": 
false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/lsr101", "lnk_target": "../../devices/virtual/net/lsr101", "mode": "0777", "mtime": 1726882887.7951593, "nlink": 1, "path": "/sys/class/net/lsr101", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 33932 1726882890.11357: no more pending results, returning what we have 33932 1726882890.11361: results queue empty 33932 1726882890.11362: checking for any_errors_fatal 33932 1726882890.11366: done checking for any_errors_fatal 33932 1726882890.11367: checking for max_fail_percentage 33932 1726882890.11368: done checking for max_fail_percentage 33932 1726882890.11369: checking to see if all hosts have failed and the running result is not ok 33932 1726882890.11370: done checking to see if all hosts have failed 33932 1726882890.11371: getting the remaining hosts for this loop 33932 1726882890.11372: done getting the remaining hosts for this loop 33932 1726882890.11376: getting the next task for host managed_node1 33932 1726882890.11382: done getting next task for host managed_node1 33932 1726882890.11384: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 33932 1726882890.11387: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882890.11391: getting variables 33932 1726882890.11392: in VariableManager get_vars() 33932 1726882890.11428: Calling all_inventory to load vars for managed_node1 33932 1726882890.11430: Calling groups_inventory to load vars for managed_node1 33932 1726882890.11432: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882890.11443: Calling all_plugins_play to load vars for managed_node1 33932 1726882890.11445: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882890.11448: Calling groups_plugins_play to load vars for managed_node1 33932 1726882890.11616: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882890.11815: done with get_vars() 33932 1726882890.11826: done getting variables 33932 1726882890.11922: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 33932 1726882890.12042: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'lsr101'] ************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:41:30 -0400 (0:00:00.476) 0:00:10.588 ****** 33932 1726882890.12073: entering _queue_task() for managed_node1/assert 33932 1726882890.12075: Creating lock for assert 33932 1726882890.12325: worker is 1 (out of 1 available) 33932 1726882890.12337: exiting _queue_task() for managed_node1/assert 33932 1726882890.12348: done queuing things up, now waiting for results queue to drain 33932 1726882890.12350: waiting for pending results... 
33932 1726882890.12648: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'lsr101' 33932 1726882890.12774: in run() - task 0e448fcc-3ce9-615b-5c48-00000000038c 33932 1726882890.12812: variable 'ansible_search_path' from source: unknown 33932 1726882890.12819: variable 'ansible_search_path' from source: unknown 33932 1726882890.13329: calling self._execute() 33932 1726882890.13414: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882890.13424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882890.13436: variable 'omit' from source: magic vars 33932 1726882890.14049: variable 'ansible_distribution_major_version' from source: facts 33932 1726882890.14068: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882890.14080: variable 'omit' from source: magic vars 33932 1726882890.14115: variable 'omit' from source: magic vars 33932 1726882890.14213: variable 'interface' from source: play vars 33932 1726882890.14384: variable 'omit' from source: magic vars 33932 1726882890.14428: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882890.14583: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882890.14609: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882890.14631: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882890.14646: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882890.14680: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882890.14693: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882890.14750: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882890.14891: Set connection var ansible_shell_executable to /bin/sh 33932 1726882890.14907: Set connection var ansible_timeout to 10 33932 1726882890.14917: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882890.14927: Set connection var ansible_pipelining to False 33932 1726882890.14934: Set connection var ansible_connection to ssh 33932 1726882890.14940: Set connection var ansible_shell_type to sh 33932 1726882890.14969: variable 'ansible_shell_executable' from source: unknown 33932 1726882890.14978: variable 'ansible_connection' from source: unknown 33932 1726882890.14984: variable 'ansible_module_compression' from source: unknown 33932 1726882890.14991: variable 'ansible_shell_type' from source: unknown 33932 1726882890.14997: variable 'ansible_shell_executable' from source: unknown 33932 1726882890.15003: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882890.15014: variable 'ansible_pipelining' from source: unknown 33932 1726882890.15021: variable 'ansible_timeout' from source: unknown 33932 1726882890.15028: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882890.15160: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882890.15178: variable 'omit' from source: magic vars 33932 1726882890.15188: starting attempt loop 33932 1726882890.15195: running the handler 33932 1726882890.15329: variable 'interface_stat' from source: set_fact 33932 1726882890.15356: Evaluated conditional (interface_stat.stat.exists): True 33932 1726882890.15370: handler run complete 33932 1726882890.15388: attempt loop complete, returning result 33932 
1726882890.15395: _execute() done 33932 1726882890.15400: dumping result to json 33932 1726882890.15407: done dumping result, returning 33932 1726882890.15417: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'lsr101' [0e448fcc-3ce9-615b-5c48-00000000038c] 33932 1726882890.15426: sending task result for task 0e448fcc-3ce9-615b-5c48-00000000038c 33932 1726882890.15523: done sending task result for task 0e448fcc-3ce9-615b-5c48-00000000038c 33932 1726882890.15529: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 33932 1726882890.15600: no more pending results, returning what we have 33932 1726882890.15603: results queue empty 33932 1726882890.15604: checking for any_errors_fatal 33932 1726882890.15612: done checking for any_errors_fatal 33932 1726882890.15613: checking for max_fail_percentage 33932 1726882890.15614: done checking for max_fail_percentage 33932 1726882890.15615: checking to see if all hosts have failed and the running result is not ok 33932 1726882890.15616: done checking to see if all hosts have failed 33932 1726882890.15617: getting the remaining hosts for this loop 33932 1726882890.15619: done getting the remaining hosts for this loop 33932 1726882890.15623: getting the next task for host managed_node1 33932 1726882890.15630: done getting next task for host managed_node1 33932 1726882890.15634: ^ task is: TASK: TEST: I can configure the MTU for a vlan interface without autoconnect. 33932 1726882890.15636: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882890.15640: getting variables 33932 1726882890.15641: in VariableManager get_vars() 33932 1726882890.15685: Calling all_inventory to load vars for managed_node1 33932 1726882890.15688: Calling groups_inventory to load vars for managed_node1 33932 1726882890.15691: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882890.15702: Calling all_plugins_play to load vars for managed_node1 33932 1726882890.15706: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882890.15709: Calling groups_plugins_play to load vars for managed_node1 33932 1726882890.15923: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882890.16119: done with get_vars() 33932 1726882890.16130: done getting variables 33932 1726882890.16376: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [TEST: I can configure the MTU for a vlan interface without autoconnect.] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:18 Friday 20 September 2024 21:41:30 -0400 (0:00:00.043) 0:00:10.631 ****** 33932 1726882890.16401: entering _queue_task() for managed_node1/debug 33932 1726882890.16611: worker is 1 (out of 1 available) 33932 1726882890.16623: exiting _queue_task() for managed_node1/debug 33932 1726882890.16633: done queuing things up, now waiting for results queue to drain 33932 1726882890.16635: waiting for pending results... 33932 1726882890.17295: running TaskExecutor() for managed_node1/TASK: TEST: I can configure the MTU for a vlan interface without autoconnect. 
33932 1726882890.17378: in run() - task 0e448fcc-3ce9-615b-5c48-00000000000e 33932 1726882890.17396: variable 'ansible_search_path' from source: unknown 33932 1726882890.17451: calling self._execute() 33932 1726882890.17539: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882890.17577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882890.17628: variable 'omit' from source: magic vars 33932 1726882890.18460: variable 'ansible_distribution_major_version' from source: facts 33932 1726882890.18483: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882890.18493: variable 'omit' from source: magic vars 33932 1726882890.18514: variable 'omit' from source: magic vars 33932 1726882890.18550: variable 'omit' from source: magic vars 33932 1726882890.18596: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882890.18633: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882890.18658: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882890.18685: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882890.18699: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882890.18798: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882890.18806: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882890.18812: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882890.18902: Set connection var ansible_shell_executable to /bin/sh 33932 1726882890.18942: Set connection var ansible_timeout to 10 33932 1726882890.18953: Set connection var 
ansible_module_compression to ZIP_DEFLATED 33932 1726882890.18962: Set connection var ansible_pipelining to False 33932 1726882890.18973: Set connection var ansible_connection to ssh 33932 1726882890.18980: Set connection var ansible_shell_type to sh 33932 1726882890.19024: variable 'ansible_shell_executable' from source: unknown 33932 1726882890.19038: variable 'ansible_connection' from source: unknown 33932 1726882890.19046: variable 'ansible_module_compression' from source: unknown 33932 1726882890.19053: variable 'ansible_shell_type' from source: unknown 33932 1726882890.19059: variable 'ansible_shell_executable' from source: unknown 33932 1726882890.19067: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882890.19075: variable 'ansible_pipelining' from source: unknown 33932 1726882890.19082: variable 'ansible_timeout' from source: unknown 33932 1726882890.19090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882890.19229: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882890.19245: variable 'omit' from source: magic vars 33932 1726882890.19306: starting attempt loop 33932 1726882890.19314: running the handler 33932 1726882890.19403: handler run complete 33932 1726882890.19428: attempt loop complete, returning result 33932 1726882890.19436: _execute() done 33932 1726882890.19444: dumping result to json 33932 1726882890.19447: done dumping result, returning 33932 1726882890.19455: done running TaskExecutor() for managed_node1/TASK: TEST: I can configure the MTU for a vlan interface without autoconnect. 
[0e448fcc-3ce9-615b-5c48-00000000000e] 33932 1726882890.19466: sending task result for task 0e448fcc-3ce9-615b-5c48-00000000000e ok: [managed_node1] => {} MSG: ################################################## 33932 1726882890.19693: no more pending results, returning what we have 33932 1726882890.19696: results queue empty 33932 1726882890.19698: checking for any_errors_fatal 33932 1726882890.19702: done checking for any_errors_fatal 33932 1726882890.19703: checking for max_fail_percentage 33932 1726882890.19705: done checking for max_fail_percentage 33932 1726882890.19706: checking to see if all hosts have failed and the running result is not ok 33932 1726882890.19707: done checking to see if all hosts have failed 33932 1726882890.19708: getting the remaining hosts for this loop 33932 1726882890.19710: done getting the remaining hosts for this loop 33932 1726882890.19714: getting the next task for host managed_node1 33932 1726882890.19723: done getting next task for host managed_node1 33932 1726882890.19729: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 33932 1726882890.19733: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882890.19750: getting variables 33932 1726882890.19752: in VariableManager get_vars() 33932 1726882890.19795: Calling all_inventory to load vars for managed_node1 33932 1726882890.19797: Calling groups_inventory to load vars for managed_node1 33932 1726882890.19800: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882890.19809: Calling all_plugins_play to load vars for managed_node1 33932 1726882890.19812: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882890.19814: Calling groups_plugins_play to load vars for managed_node1 33932 1726882890.20000: done sending task result for task 0e448fcc-3ce9-615b-5c48-00000000000e 33932 1726882890.20004: WORKER PROCESS EXITING 33932 1726882890.20023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882890.20274: done with get_vars() 33932 1726882890.20285: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:41:30 -0400 (0:00:00.041) 0:00:10.673 ****** 33932 1726882890.20551: entering _queue_task() for managed_node1/include_tasks 33932 1726882890.21446: worker is 1 (out of 1 available) 33932 1726882890.21458: exiting _queue_task() for managed_node1/include_tasks 33932 1726882890.21481: done queuing things up, now waiting for results queue to drain 33932 1726882890.21483: waiting for pending results... 
33932 1726882890.21646: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 33932 1726882890.22087: in run() - task 0e448fcc-3ce9-615b-5c48-000000000016 33932 1726882890.22121: variable 'ansible_search_path' from source: unknown 33932 1726882890.22129: variable 'ansible_search_path' from source: unknown 33932 1726882890.22178: calling self._execute() 33932 1726882890.22270: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882890.22413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882890.22426: variable 'omit' from source: magic vars 33932 1726882890.24193: variable 'ansible_distribution_major_version' from source: facts 33932 1726882890.24212: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882890.24222: _execute() done 33932 1726882890.24230: dumping result to json 33932 1726882890.24238: done dumping result, returning 33932 1726882890.24249: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-615b-5c48-000000000016] 33932 1726882890.24260: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000016 33932 1726882890.24392: no more pending results, returning what we have 33932 1726882890.24397: in VariableManager get_vars() 33932 1726882890.24440: Calling all_inventory to load vars for managed_node1 33932 1726882890.24443: Calling groups_inventory to load vars for managed_node1 33932 1726882890.24445: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882890.24456: Calling all_plugins_play to load vars for managed_node1 33932 1726882890.24459: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882890.24462: Calling groups_plugins_play to load vars for managed_node1 33932 1726882890.24904: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000016 33932 
1726882890.24909: WORKER PROCESS EXITING 33932 1726882890.24933: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882890.25317: done with get_vars() 33932 1726882890.25324: variable 'ansible_search_path' from source: unknown 33932 1726882890.25325: variable 'ansible_search_path' from source: unknown 33932 1726882890.25388: we have included files to process 33932 1726882890.25390: generating all_blocks data 33932 1726882890.25391: done generating all_blocks data 33932 1726882890.25394: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 33932 1726882890.25395: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 33932 1726882890.25396: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 33932 1726882890.26065: done processing included file 33932 1726882890.26067: iterating over new_blocks loaded from include file 33932 1726882890.26069: in VariableManager get_vars() 33932 1726882890.26092: done with get_vars() 33932 1726882890.26094: filtering new block on tags 33932 1726882890.26111: done filtering new block on tags 33932 1726882890.26113: in VariableManager get_vars() 33932 1726882890.26135: done with get_vars() 33932 1726882890.26137: filtering new block on tags 33932 1726882890.26157: done filtering new block on tags 33932 1726882890.26159: in VariableManager get_vars() 33932 1726882890.26184: done with get_vars() 33932 1726882890.26185: filtering new block on tags 33932 1726882890.26204: done filtering new block on tags 33932 1726882890.26206: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 33932 1726882890.26210: extending task lists for all hosts 
with included blocks 33932 1726882890.27058: done extending task lists 33932 1726882890.27059: done processing included files 33932 1726882890.27060: results queue empty 33932 1726882890.27061: checking for any_errors_fatal 33932 1726882890.27064: done checking for any_errors_fatal 33932 1726882890.27065: checking for max_fail_percentage 33932 1726882890.27066: done checking for max_fail_percentage 33932 1726882890.27067: checking to see if all hosts have failed and the running result is not ok 33932 1726882890.27068: done checking to see if all hosts have failed 33932 1726882890.27068: getting the remaining hosts for this loop 33932 1726882890.27070: done getting the remaining hosts for this loop 33932 1726882890.27072: getting the next task for host managed_node1 33932 1726882890.27076: done getting next task for host managed_node1 33932 1726882890.27079: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 33932 1726882890.27082: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882890.27090: getting variables 33932 1726882890.27091: in VariableManager get_vars() 33932 1726882890.27105: Calling all_inventory to load vars for managed_node1 33932 1726882890.27107: Calling groups_inventory to load vars for managed_node1 33932 1726882890.27109: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882890.27114: Calling all_plugins_play to load vars for managed_node1 33932 1726882890.27116: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882890.27119: Calling groups_plugins_play to load vars for managed_node1 33932 1726882890.27260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882890.27611: done with get_vars() 33932 1726882890.27619: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:41:30 -0400 (0:00:00.071) 0:00:10.744 ****** 33932 1726882890.27686: entering _queue_task() for managed_node1/setup 33932 1726882890.27897: worker is 1 (out of 1 available) 33932 1726882890.27907: exiting _queue_task() for managed_node1/setup 33932 1726882890.27917: done queuing things up, now waiting for results queue to drain 33932 1726882890.27918: waiting for pending results... 
33932 1726882890.28934: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 33932 1726882890.29109: in run() - task 0e448fcc-3ce9-615b-5c48-0000000004bf 33932 1726882890.29136: variable 'ansible_search_path' from source: unknown 33932 1726882890.29147: variable 'ansible_search_path' from source: unknown 33932 1726882890.29196: calling self._execute() 33932 1726882890.29293: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882890.29303: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882890.29317: variable 'omit' from source: magic vars 33932 1726882890.29837: variable 'ansible_distribution_major_version' from source: facts 33932 1726882890.29855: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882890.30113: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 33932 1726882890.32775: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 33932 1726882890.32844: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 33932 1726882890.32895: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 33932 1726882890.32935: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 33932 1726882890.32988: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 33932 1726882890.33073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882890.33119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882890.33150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882890.33209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882890.33231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882890.33291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882890.33328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882890.33359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882890.33410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882890.33432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882890.33607: variable '__network_required_facts' from source: role 
'' defaults 33932 1726882890.33625: variable 'ansible_facts' from source: unknown 33932 1726882890.33721: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 33932 1726882890.33733: when evaluation is False, skipping this task 33932 1726882890.33744: _execute() done 33932 1726882890.33750: dumping result to json 33932 1726882890.33757: done dumping result, returning 33932 1726882890.33774: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-615b-5c48-0000000004bf] 33932 1726882890.33784: sending task result for task 0e448fcc-3ce9-615b-5c48-0000000004bf skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 33932 1726882890.33924: no more pending results, returning what we have 33932 1726882890.33928: results queue empty 33932 1726882890.33929: checking for any_errors_fatal 33932 1726882890.33931: done checking for any_errors_fatal 33932 1726882890.33932: checking for max_fail_percentage 33932 1726882890.33933: done checking for max_fail_percentage 33932 1726882890.33934: checking to see if all hosts have failed and the running result is not ok 33932 1726882890.33935: done checking to see if all hosts have failed 33932 1726882890.33936: getting the remaining hosts for this loop 33932 1726882890.33938: done getting the remaining hosts for this loop 33932 1726882890.33941: getting the next task for host managed_node1 33932 1726882890.33950: done getting next task for host managed_node1 33932 1726882890.33954: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 33932 1726882890.33958: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33932 1726882890.33974: getting variables 33932 1726882890.33976: in VariableManager get_vars() 33932 1726882890.34018: Calling all_inventory to load vars for managed_node1 33932 1726882890.34021: Calling groups_inventory to load vars for managed_node1 33932 1726882890.34023: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882890.34033: Calling all_plugins_play to load vars for managed_node1 33932 1726882890.34036: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882890.34039: Calling groups_plugins_play to load vars for managed_node1 33932 1726882890.34247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882890.34496: done with get_vars() 33932 1726882890.34507: done getting variables 33932 1726882890.34714: done sending task result for task 0e448fcc-3ce9-615b-5c48-0000000004bf 33932 1726882890.34717: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:41:30 -0400 (0:00:00.070) 0:00:10.814 ****** 33932 1726882890.34740: entering _queue_task() for managed_node1/stat 33932 1726882890.35170: worker is 
1 (out of 1 available) 33932 1726882890.35184: exiting _queue_task() for managed_node1/stat 33932 1726882890.35196: done queuing things up, now waiting for results queue to drain 33932 1726882890.35198: waiting for pending results... 33932 1726882890.35501: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 33932 1726882890.35713: in run() - task 0e448fcc-3ce9-615b-5c48-0000000004c1 33932 1726882890.35734: variable 'ansible_search_path' from source: unknown 33932 1726882890.35742: variable 'ansible_search_path' from source: unknown 33932 1726882890.35790: calling self._execute() 33932 1726882890.35923: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882890.35934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882890.35947: variable 'omit' from source: magic vars 33932 1726882890.36374: variable 'ansible_distribution_major_version' from source: facts 33932 1726882890.36391: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882890.36581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 33932 1726882890.36877: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 33932 1726882890.36932: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 33932 1726882890.36980: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 33932 1726882890.37029: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 33932 1726882890.37129: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 33932 1726882890.37163: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 33932 1726882890.37203: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882890.37243: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 33932 1726882890.37347: variable '__network_is_ostree' from source: set_fact 33932 1726882890.37359: Evaluated conditional (not __network_is_ostree is defined): False 33932 1726882890.37374: when evaluation is False, skipping this task 33932 1726882890.37382: _execute() done 33932 1726882890.37394: dumping result to json 33932 1726882890.37402: done dumping result, returning 33932 1726882890.37412: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-615b-5c48-0000000004c1] 33932 1726882890.37423: sending task result for task 0e448fcc-3ce9-615b-5c48-0000000004c1 skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 33932 1726882890.37589: no more pending results, returning what we have 33932 1726882890.37593: results queue empty 33932 1726882890.37594: checking for any_errors_fatal 33932 1726882890.37600: done checking for any_errors_fatal 33932 1726882890.37601: checking for max_fail_percentage 33932 1726882890.37602: done checking for max_fail_percentage 33932 1726882890.37603: checking to see if all hosts have failed and the running result is not ok 33932 1726882890.37604: done checking to see if all hosts have failed 33932 1726882890.37605: getting the remaining hosts for this loop 33932 
1726882890.37607: done getting the remaining hosts for this loop 33932 1726882890.37611: getting the next task for host managed_node1 33932 1726882890.37617: done getting next task for host managed_node1 33932 1726882890.37621: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 33932 1726882890.37625: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882890.37637: getting variables 33932 1726882890.37639: in VariableManager get_vars() 33932 1726882890.37686: Calling all_inventory to load vars for managed_node1 33932 1726882890.37690: Calling groups_inventory to load vars for managed_node1 33932 1726882890.37692: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882890.37702: Calling all_plugins_play to load vars for managed_node1 33932 1726882890.37704: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882890.37707: Calling groups_plugins_play to load vars for managed_node1 33932 1726882890.37899: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882890.38139: done with get_vars() 33932 1726882890.38149: done getting variables 33932 1726882890.38322: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 33932 1726882890.38351: done sending task result for task 0e448fcc-3ce9-615b-5c48-0000000004c1 33932 1726882890.38353: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:41:30 -0400 (0:00:00.036) 0:00:10.851 ****** 33932 1726882890.38372: entering _queue_task() for managed_node1/set_fact 33932 1726882890.38704: worker is 1 (out of 1 available) 33932 1726882890.38716: exiting _queue_task() for managed_node1/set_fact 33932 1726882890.38727: done queuing things up, now waiting for results queue to drain 33932 1726882890.38729: waiting for pending results... 
33932 1726882890.39014: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 33932 1726882890.39211: in run() - task 0e448fcc-3ce9-615b-5c48-0000000004c2 33932 1726882890.39236: variable 'ansible_search_path' from source: unknown 33932 1726882890.39245: variable 'ansible_search_path' from source: unknown 33932 1726882890.39311: calling self._execute() 33932 1726882890.39427: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882890.39437: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882890.39449: variable 'omit' from source: magic vars 33932 1726882890.39910: variable 'ansible_distribution_major_version' from source: facts 33932 1726882890.39931: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882890.40208: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 33932 1726882890.40535: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 33932 1726882890.40589: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 33932 1726882890.40634: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 33932 1726882890.40686: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 33932 1726882890.40913: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 33932 1726882890.40948: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 33932 1726882890.40986: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882890.41026: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 33932 1726882890.41118: variable '__network_is_ostree' from source: set_fact 33932 1726882890.41137: Evaluated conditional (not __network_is_ostree is defined): False 33932 1726882890.41148: when evaluation is False, skipping this task 33932 1726882890.41154: _execute() done 33932 1726882890.41161: dumping result to json 33932 1726882890.41173: done dumping result, returning 33932 1726882890.41184: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-615b-5c48-0000000004c2] 33932 1726882890.41193: sending task result for task 0e448fcc-3ce9-615b-5c48-0000000004c2 skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 33932 1726882890.41326: no more pending results, returning what we have 33932 1726882890.41330: results queue empty 33932 1726882890.41331: checking for any_errors_fatal 33932 1726882890.41338: done checking for any_errors_fatal 33932 1726882890.41338: checking for max_fail_percentage 33932 1726882890.41340: done checking for max_fail_percentage 33932 1726882890.41341: checking to see if all hosts have failed and the running result is not ok 33932 1726882890.41342: done checking to see if all hosts have failed 33932 1726882890.41343: getting the remaining hosts for this loop 33932 1726882890.41345: done getting the remaining hosts for this loop 33932 1726882890.41348: getting the next task for host managed_node1 33932 1726882890.41356: done getting next task for host managed_node1 33932 
1726882890.41361: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 33932 1726882890.41366: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33932 1726882890.41380: getting variables 33932 1726882890.41382: in VariableManager get_vars() 33932 1726882890.41422: Calling all_inventory to load vars for managed_node1 33932 1726882890.41425: Calling groups_inventory to load vars for managed_node1 33932 1726882890.41427: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882890.41437: Calling all_plugins_play to load vars for managed_node1 33932 1726882890.41440: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882890.41443: Calling groups_plugins_play to load vars for managed_node1 33932 1726882890.41645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882890.41846: done with get_vars() 33932 1726882890.41855: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 
Friday 20 September 2024 21:41:30 -0400 (0:00:00.035) 0:00:10.887 ****** 33932 1726882890.41949: entering _queue_task() for managed_node1/service_facts 33932 1726882890.41952: Creating lock for service_facts 33932 1726882890.42093: done sending task result for task 0e448fcc-3ce9-615b-5c48-0000000004c2 33932 1726882890.42096: WORKER PROCESS EXITING 33932 1726882890.42365: worker is 1 (out of 1 available) 33932 1726882890.42379: exiting _queue_task() for managed_node1/service_facts 33932 1726882890.42390: done queuing things up, now waiting for results queue to drain 33932 1726882890.42392: waiting for pending results... 33932 1726882890.42642: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 33932 1726882890.42782: in run() - task 0e448fcc-3ce9-615b-5c48-0000000004c4 33932 1726882890.42799: variable 'ansible_search_path' from source: unknown 33932 1726882890.42807: variable 'ansible_search_path' from source: unknown 33932 1726882890.42849: calling self._execute() 33932 1726882890.42929: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882890.42944: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882890.42957: variable 'omit' from source: magic vars 33932 1726882890.43320: variable 'ansible_distribution_major_version' from source: facts 33932 1726882890.43337: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882890.43347: variable 'omit' from source: magic vars 33932 1726882890.43428: variable 'omit' from source: magic vars 33932 1726882890.43467: variable 'omit' from source: magic vars 33932 1726882890.43514: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882890.43551: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882890.43578: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882890.43602: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882890.43617: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882890.43647: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882890.43656: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882890.43663: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882890.43774: Set connection var ansible_shell_executable to /bin/sh 33932 1726882890.43788: Set connection var ansible_timeout to 10 33932 1726882890.43796: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882890.43804: Set connection var ansible_pipelining to False 33932 1726882890.43814: Set connection var ansible_connection to ssh 33932 1726882890.43820: Set connection var ansible_shell_type to sh 33932 1726882890.43844: variable 'ansible_shell_executable' from source: unknown 33932 1726882890.43851: variable 'ansible_connection' from source: unknown 33932 1726882890.43858: variable 'ansible_module_compression' from source: unknown 33932 1726882890.43866: variable 'ansible_shell_type' from source: unknown 33932 1726882890.43875: variable 'ansible_shell_executable' from source: unknown 33932 1726882890.43882: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882890.43889: variable 'ansible_pipelining' from source: unknown 33932 1726882890.43895: variable 'ansible_timeout' from source: unknown 33932 1726882890.43901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882890.44088: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 33932 1726882890.44102: variable 'omit' from source: magic vars 33932 1726882890.44110: starting attempt loop 33932 1726882890.44116: running the handler 33932 1726882890.44133: _low_level_execute_command(): starting 33932 1726882890.44146: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 33932 1726882890.44907: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882890.44922: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882890.44938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882890.44958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882890.45005: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882890.45022: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882890.45037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882890.45057: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882890.45076: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882890.45089: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882890.45100: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882890.45112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882890.45126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882890.45138: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.44.90 originally 10.31.44.90 <<< 33932 1726882890.45147: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882890.45159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882890.45230: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882890.45255: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882890.45276: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882890.45417: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882890.47108: stdout chunk (state=3): >>>/root <<< 33932 1726882890.47304: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882890.47307: stdout chunk (state=3): >>><<< 33932 1726882890.47310: stderr chunk (state=3): >>><<< 33932 1726882890.47373: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882890.47377: _low_level_execute_command(): starting 33932 1726882890.47381: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882890.4733584-34510-158692172706465 `" && echo ansible-tmp-1726882890.4733584-34510-158692172706465="` echo /root/.ansible/tmp/ansible-tmp-1726882890.4733584-34510-158692172706465 `" ) && sleep 0' 33932 1726882890.48060: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882890.48078: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882890.48091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882890.48118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882890.48159: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882890.48176: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882890.48189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882890.48203: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882890.48222: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882890.48234: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882890.48245: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882890.48255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882890.48273: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882890.48284: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882890.48294: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882890.48306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882890.48386: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882890.48404: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882890.48418: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882890.48566: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882890.50476: stdout chunk (state=3): >>>ansible-tmp-1726882890.4733584-34510-158692172706465=/root/.ansible/tmp/ansible-tmp-1726882890.4733584-34510-158692172706465 <<< 33932 1726882890.50585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882890.50677: stderr chunk (state=3): >>><<< 33932 1726882890.50691: stdout chunk (state=3): >>><<< 33932 1726882890.50879: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882890.4733584-34510-158692172706465=/root/.ansible/tmp/ansible-tmp-1726882890.4733584-34510-158692172706465 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882890.50882: variable 'ansible_module_compression' from source: unknown 33932 1726882890.50885: ANSIBALLZ: Using lock for service_facts 33932 1726882890.50887: ANSIBALLZ: Acquiring lock 33932 1726882890.50889: ANSIBALLZ: Lock acquired: 140301140044848 33932 1726882890.50891: ANSIBALLZ: Creating module 33932 1726882890.66381: ANSIBALLZ: Writing module into payload 33932 1726882890.66385: ANSIBALLZ: Writing module 33932 1726882890.66387: ANSIBALLZ: Renaming module 33932 1726882890.66390: ANSIBALLZ: Done creating module 33932 1726882890.66392: variable 'ansible_facts' from source: unknown 33932 1726882890.66394: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882890.4733584-34510-158692172706465/AnsiballZ_service_facts.py 33932 1726882890.66396: Sending initial data 33932 1726882890.66398: Sent initial data (162 bytes) 33932 1726882890.67084: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882890.67091: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882890.67095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882890.67097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882890.67099: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882890.67102: stderr chunk (state=3): 
>>>debug2: match not found <<< 33932 1726882890.67104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882890.67106: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882890.67108: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882890.67110: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882890.67115: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882890.67125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882890.67136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882890.67144: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882890.67151: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882890.67160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882890.67486: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882890.67490: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882890.67492: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882890.67635: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882890.69471: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server 
supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 33932 1726882890.69550: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 33932 1726882890.69657: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-33932njcnmxb6/tmp6nkmpnj4 /root/.ansible/tmp/ansible-tmp-1726882890.4733584-34510-158692172706465/AnsiballZ_service_facts.py <<< 33932 1726882890.69751: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 33932 1726882890.71147: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882890.71249: stderr chunk (state=3): >>><<< 33932 1726882890.71252: stdout chunk (state=3): >>><<< 33932 1726882890.71273: done transferring module to remote 33932 1726882890.71284: _low_level_execute_command(): starting 33932 1726882890.71289: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882890.4733584-34510-158692172706465/ /root/.ansible/tmp/ansible-tmp-1726882890.4733584-34510-158692172706465/AnsiballZ_service_facts.py && sleep 0' 33932 1726882890.72670: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882890.72686: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882890.72700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882890.72715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882890.72753: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882890.72767: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882890.72784: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882890.72801: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882890.72812: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882890.72821: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882890.72832: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882890.72843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882890.72857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882890.72873: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882890.72884: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882890.72898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882890.72978: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882890.72995: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882890.73012: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882890.73140: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882890.74957: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882890.75017: stderr chunk (state=3): >>><<< 33932 1726882890.75020: stdout chunk (state=3): >>><<< 33932 1726882890.75114: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882890.75117: _low_level_execute_command(): starting 33932 1726882890.75120: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882890.4733584-34510-158692172706465/AnsiballZ_service_facts.py && sleep 0' 33932 1726882890.75745: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882890.75762: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882890.75790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882890.75810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882890.75852: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882890.75866: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882890.75895: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882890.75914: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882890.75926: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882890.75936: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882890.75947: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882890.75959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882890.75979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882890.75993: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882890.76014: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882890.76028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882890.76116: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882890.76141: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882890.76157: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882890.76299: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882892.10750: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", 
"state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": 
"restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source"<<< 33932 1726882892.10784: stdout chunk (state=3): >>>: "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": 
"sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": 
"systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": 
"dbus-org.freedesktop.locale1.service", "state": "inactive", "status": <<< 33932 1726882892.10789: stdout chunk (state=3): >>>"alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": 
"kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", 
"status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": 
"systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 33932 1726882892.12101: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 33932 1726882892.12105: stdout chunk (state=3): >>><<< 33932 1726882892.12111: stderr chunk (state=3): >>><<< 33932 1726882892.12135: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": 
"systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, 
"systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": 
{"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": 
{"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", 
"state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
33932 1726882892.12737: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882890.4733584-34510-158692172706465/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 33932 1726882892.12743: _low_level_execute_command(): starting 33932 1726882892.12748: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882890.4733584-34510-158692172706465/ > /dev/null 2>&1 && sleep 0' 33932 1726882892.14075: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882892.14516: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882892.14526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882892.14541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882892.14582: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882892.14622: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882892.14631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882892.14644: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882892.14733: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address 
<<< 33932 1726882892.14738: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882892.14747: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882892.14754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882892.14771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882892.14774: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882892.14782: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882892.14792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882892.14861: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882892.14956: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882892.14971: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882892.15100: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882892.16981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882892.16984: stdout chunk (state=3): >>><<< 33932 1726882892.16992: stderr chunk (state=3): >>><<< 33932 1726882892.17011: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882892.17017: handler run complete 33932 1726882892.17198: variable 'ansible_facts' from source: unknown 33932 1726882892.17335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882892.18174: variable 'ansible_facts' from source: unknown 33932 1726882892.18304: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882892.18701: attempt loop complete, returning result 33932 1726882892.18705: _execute() done 33932 1726882892.18707: dumping result to json 33932 1726882892.18761: done dumping result, returning 33932 1726882892.18773: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-615b-5c48-0000000004c4] 33932 1726882892.18776: sending task result for task 0e448fcc-3ce9-615b-5c48-0000000004c4 33932 1726882892.19919: done sending task result for task 0e448fcc-3ce9-615b-5c48-0000000004c4 33932 1726882892.19922: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 33932 1726882892.19982: no more pending results, returning what we have 33932 1726882892.19985: results queue empty 33932 1726882892.19986: checking for 
any_errors_fatal 33932 1726882892.19990: done checking for any_errors_fatal 33932 1726882892.19991: checking for max_fail_percentage 33932 1726882892.19992: done checking for max_fail_percentage 33932 1726882892.19993: checking to see if all hosts have failed and the running result is not ok 33932 1726882892.19994: done checking to see if all hosts have failed 33932 1726882892.19995: getting the remaining hosts for this loop 33932 1726882892.19997: done getting the remaining hosts for this loop 33932 1726882892.20001: getting the next task for host managed_node1 33932 1726882892.20007: done getting next task for host managed_node1 33932 1726882892.20010: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 33932 1726882892.20014: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882892.20023: getting variables 33932 1726882892.20025: in VariableManager get_vars() 33932 1726882892.20062: Calling all_inventory to load vars for managed_node1 33932 1726882892.20067: Calling groups_inventory to load vars for managed_node1 33932 1726882892.20071: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882892.20081: Calling all_plugins_play to load vars for managed_node1 33932 1726882892.20084: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882892.20089: Calling groups_plugins_play to load vars for managed_node1 33932 1726882892.20439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882892.21627: done with get_vars() 33932 1726882892.21640: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:41:32 -0400 (0:00:01.797) 0:00:12.685 ****** 33932 1726882892.21741: entering _queue_task() for managed_node1/package_facts 33932 1726882892.21743: Creating lock for package_facts 33932 1726882892.22016: worker is 1 (out of 1 available) 33932 1726882892.22029: exiting _queue_task() for managed_node1/package_facts 33932 1726882892.22041: done queuing things up, now waiting for results queue to drain 33932 1726882892.22042: waiting for pending results... 
33932 1726882892.22919: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 33932 1726882892.23071: in run() - task 0e448fcc-3ce9-615b-5c48-0000000004c5 33932 1726882892.23092: variable 'ansible_search_path' from source: unknown 33932 1726882892.23099: variable 'ansible_search_path' from source: unknown 33932 1726882892.23148: calling self._execute() 33932 1726882892.23246: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882892.23256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882892.23273: variable 'omit' from source: magic vars 33932 1726882892.23670: variable 'ansible_distribution_major_version' from source: facts 33932 1726882892.23691: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882892.23701: variable 'omit' from source: magic vars 33932 1726882892.23784: variable 'omit' from source: magic vars 33932 1726882892.23821: variable 'omit' from source: magic vars 33932 1726882892.23862: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882892.23912: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882892.23937: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882892.23958: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882892.23979: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882892.24017: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882892.24025: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882892.24032: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 33932 1726882892.24145: Set connection var ansible_shell_executable to /bin/sh 33932 1726882892.24158: Set connection var ansible_timeout to 10 33932 1726882892.24174: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882892.24185: Set connection var ansible_pipelining to False 33932 1726882892.24192: Set connection var ansible_connection to ssh 33932 1726882892.24198: Set connection var ansible_shell_type to sh 33932 1726882892.24232: variable 'ansible_shell_executable' from source: unknown 33932 1726882892.24240: variable 'ansible_connection' from source: unknown 33932 1726882892.24247: variable 'ansible_module_compression' from source: unknown 33932 1726882892.24253: variable 'ansible_shell_type' from source: unknown 33932 1726882892.24259: variable 'ansible_shell_executable' from source: unknown 33932 1726882892.24266: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882892.24278: variable 'ansible_pipelining' from source: unknown 33932 1726882892.24285: variable 'ansible_timeout' from source: unknown 33932 1726882892.24293: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882892.24497: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 33932 1726882892.24512: variable 'omit' from source: magic vars 33932 1726882892.24522: starting attempt loop 33932 1726882892.24535: running the handler 33932 1726882892.24551: _low_level_execute_command(): starting 33932 1726882892.24562: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 33932 1726882892.25361: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882892.25382: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 33932 1726882892.25400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882892.25423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882892.25470: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882892.25483: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882892.25496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882892.25523: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882892.25536: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882892.25548: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882892.25561: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882892.25582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882892.25599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882892.25613: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882892.25630: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882892.25648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882892.25731: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882892.25751: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882892.25766: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882892.25959: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 
1726882892.27582: stdout chunk (state=3): >>>/root <<< 33932 1726882892.27777: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882892.27780: stdout chunk (state=3): >>><<< 33932 1726882892.27783: stderr chunk (state=3): >>><<< 33932 1726882892.27867: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882892.27874: _low_level_execute_command(): starting 33932 1726882892.27879: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882892.2780113-34588-258336029113534 `" && echo ansible-tmp-1726882892.2780113-34588-258336029113534="` echo /root/.ansible/tmp/ansible-tmp-1726882892.2780113-34588-258336029113534 `" ) && sleep 0' 33932 1726882892.28842: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 33932 1726882892.28851: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882892.28861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882892.28878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882892.28914: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882892.28921: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882892.28930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882892.28943: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882892.28950: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882892.28956: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882892.28965: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882892.28975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882892.28986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882892.28997: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882892.29000: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882892.29007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882892.29079: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882892.29089: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882892.29108: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 33932 1726882892.29238: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882892.31102: stdout chunk (state=3): >>>ansible-tmp-1726882892.2780113-34588-258336029113534=/root/.ansible/tmp/ansible-tmp-1726882892.2780113-34588-258336029113534 <<< 33932 1726882892.31272: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882892.31276: stderr chunk (state=3): >>><<< 33932 1726882892.31278: stdout chunk (state=3): >>><<< 33932 1726882892.31297: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882892.2780113-34588-258336029113534=/root/.ansible/tmp/ansible-tmp-1726882892.2780113-34588-258336029113534 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882892.31337: variable 'ansible_module_compression' from source: unknown 33932 1726882892.31389: ANSIBALLZ: Using lock for package_facts 33932 
1726882892.31392: ANSIBALLZ: Acquiring lock 33932 1726882892.31395: ANSIBALLZ: Lock acquired: 140301138590608 33932 1726882892.31397: ANSIBALLZ: Creating module 33932 1726882892.57539: ANSIBALLZ: Writing module into payload 33932 1726882892.57648: ANSIBALLZ: Writing module 33932 1726882892.57677: ANSIBALLZ: Renaming module 33932 1726882892.57683: ANSIBALLZ: Done creating module 33932 1726882892.57699: variable 'ansible_facts' from source: unknown 33932 1726882892.57812: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882892.2780113-34588-258336029113534/AnsiballZ_package_facts.py 33932 1726882892.57924: Sending initial data 33932 1726882892.57927: Sent initial data (162 bytes) 33932 1726882892.58627: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882892.58640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882892.58671: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882892.58674: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration <<< 33932 1726882892.58685: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882892.58690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882892.58699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882892.58705: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882892.58766: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882892.58776: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882892.58779: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882892.58901: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882892.60757: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 33932 1726882892.60778: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 33932 1726882892.60867: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 33932 1726882892.60965: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-33932njcnmxb6/tmp3ad_v7xc /root/.ansible/tmp/ansible-tmp-1726882892.2780113-34588-258336029113534/AnsiballZ_package_facts.py <<< 33932 1726882892.61056: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 33932 1726882892.62985: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882892.63083: stderr chunk (state=3): >>><<< 33932 1726882892.63087: stdout chunk (state=3): >>><<< 33932 1726882892.63101: done transferring module to remote 33932 1726882892.63110: _low_level_execute_command(): starting 33932 
1726882892.63114: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882892.2780113-34588-258336029113534/ /root/.ansible/tmp/ansible-tmp-1726882892.2780113-34588-258336029113534/AnsiballZ_package_facts.py && sleep 0' 33932 1726882892.63560: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882892.63568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882892.63603: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 33932 1726882892.63609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882892.63623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882892.63629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882892.63693: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882892.63696: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882892.63699: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882892.63793: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882892.65562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 
<<< 33932 1726882892.65704: stderr chunk (state=3): >>><<< 33932 1726882892.65707: stdout chunk (state=3): >>><<< 33932 1726882892.65709: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882892.65712: _low_level_execute_command(): starting 33932 1726882892.65714: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882892.2780113-34588-258336029113534/AnsiballZ_package_facts.py && sleep 0' 33932 1726882892.66251: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882892.66259: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882892.66273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882892.66287: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882892.66323: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882892.66337: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882892.66358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882892.66377: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882892.66380: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882892.66395: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882892.66398: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882892.66409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882892.66421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882892.66429: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882892.66435: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882892.66444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882892.66559: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882892.66583: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882892.66603: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882892.66721: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882893.12918: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", 
"version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": 
"6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "li<<< 33932 1726882893.12980: stdout chunk (state=3): >>>bpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": 
"8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": <<< 33932 1726882893.13012: stdout chunk (state=3): >>>[{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": 
"49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "a<<< 33932 1726882893.13020: stdout chunk (state=3): >>>rch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": 
"2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86<<< 33932 1726882893.13042: stdout chunk (state=3): >>>_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", 
"release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3"<<< 33932 1726882893.13049: stdout chunk (state=3): >>>, "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": 
"python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}<<< 33932 1726882893.13052: stdout chunk (state=3): >>>], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": 
"1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", 
"version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}],<<< 33932 1726882893.13091: stdout chunk (state=3): >>> "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": 
"x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", 
"release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146<<< 33932 1726882893.13095: stdout chunk (state=3): >>>.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", 
"release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": 
"perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": 
"1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": 
"systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "<<< 33932 1726882893.13102: stdout chunk (state=3): >>>x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", 
"release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "e<<< 33932 1726882893.13105: stdout chunk 
(state=3): >>>poch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", 
"release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": 
"1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 33932 1726882893.14681: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 33932 1726882893.14684: stdout chunk (state=3): >>><<< 33932 1726882893.14686: stderr chunk (state=3): >>><<< 33932 1726882893.15378: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", 
"version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": 
"libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": 
"2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", 
"version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", 
"release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, 
"arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": 
"kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": 
"nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", 
"version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", 
"release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": 
"18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", 
"release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", 
"source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", 
"release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", 
"release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": 
[{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
33932 1726882893.17838: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882892.2780113-34588-258336029113534/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 33932 1726882893.17873: _low_level_execute_command(): starting 33932 1726882893.17886: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882892.2780113-34588-258336029113534/ > /dev/null 2>&1 && sleep 0' 33932 1726882893.18561: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882893.18583: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882893.18599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882893.18618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882893.18662: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882893.18681: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882893.18696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882893.18713: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882893.18724: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address 
<<< 33932 1726882893.18738: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882893.18750: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882893.18765: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882893.18786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882893.18797: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882893.18807: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882893.18819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882893.18899: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882893.18915: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882893.18929: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882893.19183: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882893.21000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882893.21004: stdout chunk (state=3): >>><<< 33932 1726882893.21006: stderr chunk (state=3): >>><<< 33932 1726882893.21461: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882893.21466: handler run complete 33932 1726882893.22305: variable 'ansible_facts' from source: unknown 33932 1726882893.23369: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882893.25496: variable 'ansible_facts' from source: unknown 33932 1726882893.25759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882893.26211: attempt loop complete, returning result 33932 1726882893.26223: _execute() done 33932 1726882893.26226: dumping result to json 33932 1726882893.26355: done dumping result, returning 33932 1726882893.26365: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-615b-5c48-0000000004c5] 33932 1726882893.26373: sending task result for task 0e448fcc-3ce9-615b-5c48-0000000004c5 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 33932 1726882893.34337: no more pending results, returning what we have 33932 1726882893.34340: results queue empty 33932 1726882893.34341: checking for any_errors_fatal 33932 1726882893.34344: done checking for any_errors_fatal 33932 1726882893.34345: checking for max_fail_percentage 33932 
1726882893.34346: done checking for max_fail_percentage 33932 1726882893.34347: checking to see if all hosts have failed and the running result is not ok 33932 1726882893.34348: done checking to see if all hosts have failed 33932 1726882893.34349: getting the remaining hosts for this loop 33932 1726882893.34350: done getting the remaining hosts for this loop 33932 1726882893.34355: getting the next task for host managed_node1 33932 1726882893.34362: done getting next task for host managed_node1 33932 1726882893.34366: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 33932 1726882893.34369: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882893.34378: getting variables 33932 1726882893.34380: in VariableManager get_vars() 33932 1726882893.34412: Calling all_inventory to load vars for managed_node1 33932 1726882893.34415: Calling groups_inventory to load vars for managed_node1 33932 1726882893.34418: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882893.34427: Calling all_plugins_play to load vars for managed_node1 33932 1726882893.34430: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882893.34433: Calling groups_plugins_play to load vars for managed_node1 33932 1726882893.35518: done sending task result for task 0e448fcc-3ce9-615b-5c48-0000000004c5 33932 1726882893.35522: WORKER PROCESS EXITING 33932 1726882893.36041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882893.37523: done with get_vars() 33932 1726882893.37542: done getting variables 33932 1726882893.37590: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:41:33 -0400 (0:00:01.158) 0:00:13.843 ****** 33932 1726882893.37613: entering _queue_task() for managed_node1/debug 33932 1726882893.37888: worker is 1 (out of 1 available) 33932 1726882893.37903: exiting _queue_task() for managed_node1/debug 33932 1726882893.37917: done queuing things up, now waiting for results queue to drain 33932 1726882893.37942: waiting for pending results... 
33932 1726882893.38136: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 33932 1726882893.38876: in run() - task 0e448fcc-3ce9-615b-5c48-000000000017 33932 1726882893.38879: variable 'ansible_search_path' from source: unknown 33932 1726882893.38882: variable 'ansible_search_path' from source: unknown 33932 1726882893.38885: calling self._execute() 33932 1726882893.38888: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882893.38890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882893.38892: variable 'omit' from source: magic vars 33932 1726882893.38894: variable 'ansible_distribution_major_version' from source: facts 33932 1726882893.38897: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882893.38899: variable 'omit' from source: magic vars 33932 1726882893.38901: variable 'omit' from source: magic vars 33932 1726882893.39079: variable 'network_provider' from source: set_fact 33932 1726882893.39083: variable 'omit' from source: magic vars 33932 1726882893.39086: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882893.39088: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882893.39091: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882893.39093: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882893.39095: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882893.39116: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882893.39119: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 
1726882893.39123: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882893.39211: Set connection var ansible_shell_executable to /bin/sh 33932 1726882893.39217: Set connection var ansible_timeout to 10 33932 1726882893.39222: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882893.39227: Set connection var ansible_pipelining to False 33932 1726882893.39230: Set connection var ansible_connection to ssh 33932 1726882893.39232: Set connection var ansible_shell_type to sh 33932 1726882893.39249: variable 'ansible_shell_executable' from source: unknown 33932 1726882893.39253: variable 'ansible_connection' from source: unknown 33932 1726882893.39255: variable 'ansible_module_compression' from source: unknown 33932 1726882893.39258: variable 'ansible_shell_type' from source: unknown 33932 1726882893.39260: variable 'ansible_shell_executable' from source: unknown 33932 1726882893.39266: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882893.39274: variable 'ansible_pipelining' from source: unknown 33932 1726882893.39285: variable 'ansible_timeout' from source: unknown 33932 1726882893.39293: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882893.39412: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882893.39427: variable 'omit' from source: magic vars 33932 1726882893.39438: starting attempt loop 33932 1726882893.39443: running the handler 33932 1726882893.39498: handler run complete 33932 1726882893.39524: attempt loop complete, returning result 33932 1726882893.39533: _execute() done 33932 1726882893.39541: dumping result to json 33932 1726882893.39549: done dumping result, returning 
33932 1726882893.39561: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-615b-5c48-000000000017] 33932 1726882893.39610: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000017 ok: [managed_node1] => {} MSG: Using network provider: nm 33932 1726882893.40053: no more pending results, returning what we have 33932 1726882893.40057: results queue empty 33932 1726882893.40058: checking for any_errors_fatal 33932 1726882893.40078: done checking for any_errors_fatal 33932 1726882893.40079: checking for max_fail_percentage 33932 1726882893.40081: done checking for max_fail_percentage 33932 1726882893.40082: checking to see if all hosts have failed and the running result is not ok 33932 1726882893.40083: done checking to see if all hosts have failed 33932 1726882893.40083: getting the remaining hosts for this loop 33932 1726882893.40085: done getting the remaining hosts for this loop 33932 1726882893.40089: getting the next task for host managed_node1 33932 1726882893.40095: done getting next task for host managed_node1 33932 1726882893.40099: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 33932 1726882893.40102: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882893.40112: getting variables 33932 1726882893.40114: in VariableManager get_vars() 33932 1726882893.40152: Calling all_inventory to load vars for managed_node1 33932 1726882893.40155: Calling groups_inventory to load vars for managed_node1 33932 1726882893.40158: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882893.40171: Calling all_plugins_play to load vars for managed_node1 33932 1726882893.40292: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882893.40302: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000017 33932 1726882893.40305: WORKER PROCESS EXITING 33932 1726882893.40310: Calling groups_plugins_play to load vars for managed_node1 33932 1726882893.41921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882893.43060: done with get_vars() 33932 1726882893.43087: done getting variables 33932 1726882893.43154: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:41:33 -0400 (0:00:00.055) 0:00:13.899 ****** 33932 1726882893.43188: entering _queue_task() for managed_node1/fail 33932 1726882893.43517: worker is 1 (out of 1 available) 33932 1726882893.43561: exiting _queue_task() for managed_node1/fail 33932 1726882893.43581: done queuing things up, now waiting for results queue to drain 33932 1726882893.43587: waiting for pending results... 
33932 1726882893.43904: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 33932 1726882893.43995: in run() - task 0e448fcc-3ce9-615b-5c48-000000000018 33932 1726882893.43999: variable 'ansible_search_path' from source: unknown 33932 1726882893.44002: variable 'ansible_search_path' from source: unknown 33932 1726882893.44038: calling self._execute() 33932 1726882893.44107: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882893.44112: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882893.44121: variable 'omit' from source: magic vars 33932 1726882893.44399: variable 'ansible_distribution_major_version' from source: facts 33932 1726882893.44408: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882893.44495: variable 'network_state' from source: role '' defaults 33932 1726882893.44503: Evaluated conditional (network_state != {}): False 33932 1726882893.44507: when evaluation is False, skipping this task 33932 1726882893.44517: _execute() done 33932 1726882893.44522: dumping result to json 33932 1726882893.44525: done dumping result, returning 33932 1726882893.44530: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-615b-5c48-000000000018] 33932 1726882893.44581: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000018 33932 1726882893.44675: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000018 33932 1726882893.44690: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 33932 1726882893.44739: no more pending results, 
returning what we have 33932 1726882893.44742: results queue empty 33932 1726882893.44743: checking for any_errors_fatal 33932 1726882893.44749: done checking for any_errors_fatal 33932 1726882893.44750: checking for max_fail_percentage 33932 1726882893.44752: done checking for max_fail_percentage 33932 1726882893.44753: checking to see if all hosts have failed and the running result is not ok 33932 1726882893.44754: done checking to see if all hosts have failed 33932 1726882893.44755: getting the remaining hosts for this loop 33932 1726882893.44756: done getting the remaining hosts for this loop 33932 1726882893.44760: getting the next task for host managed_node1 33932 1726882893.44769: done getting next task for host managed_node1 33932 1726882893.44773: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 33932 1726882893.44778: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882893.44793: getting variables 33932 1726882893.44796: in VariableManager get_vars() 33932 1726882893.44836: Calling all_inventory to load vars for managed_node1 33932 1726882893.44839: Calling groups_inventory to load vars for managed_node1 33932 1726882893.44841: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882893.44853: Calling all_plugins_play to load vars for managed_node1 33932 1726882893.44856: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882893.44859: Calling groups_plugins_play to load vars for managed_node1 33932 1726882893.46576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882893.48441: done with get_vars() 33932 1726882893.48477: done getting variables 33932 1726882893.48537: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:41:33 -0400 (0:00:00.053) 0:00:13.953 ****** 33932 1726882893.48581: entering _queue_task() for managed_node1/fail 33932 1726882893.48877: worker is 1 (out of 1 available) 33932 1726882893.48899: exiting _queue_task() for managed_node1/fail 33932 1726882893.48909: done queuing things up, now waiting for results queue to drain 33932 1726882893.48911: waiting for pending results... 
33932 1726882893.49201: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 33932 1726882893.49349: in run() - task 0e448fcc-3ce9-615b-5c48-000000000019 33932 1726882893.49371: variable 'ansible_search_path' from source: unknown 33932 1726882893.49379: variable 'ansible_search_path' from source: unknown 33932 1726882893.49419: calling self._execute() 33932 1726882893.49521: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882893.49532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882893.49557: variable 'omit' from source: magic vars 33932 1726882893.49947: variable 'ansible_distribution_major_version' from source: facts 33932 1726882893.49966: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882893.50117: variable 'network_state' from source: role '' defaults 33932 1726882893.50132: Evaluated conditional (network_state != {}): False 33932 1726882893.50139: when evaluation is False, skipping this task 33932 1726882893.50146: _execute() done 33932 1726882893.50154: dumping result to json 33932 1726882893.50161: done dumping result, returning 33932 1726882893.50174: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-615b-5c48-000000000019] 33932 1726882893.50185: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000019 33932 1726882893.50308: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000019 33932 1726882893.50323: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 33932 1726882893.50375: no more pending results, returning what we have 33932 
1726882893.50379: results queue empty 33932 1726882893.50380: checking for any_errors_fatal 33932 1726882893.50388: done checking for any_errors_fatal 33932 1726882893.50388: checking for max_fail_percentage 33932 1726882893.50390: done checking for max_fail_percentage 33932 1726882893.50391: checking to see if all hosts have failed and the running result is not ok 33932 1726882893.50392: done checking to see if all hosts have failed 33932 1726882893.50393: getting the remaining hosts for this loop 33932 1726882893.50395: done getting the remaining hosts for this loop 33932 1726882893.50399: getting the next task for host managed_node1 33932 1726882893.50406: done getting next task for host managed_node1 33932 1726882893.50411: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 33932 1726882893.50415: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882893.50430: getting variables 33932 1726882893.50432: in VariableManager get_vars() 33932 1726882893.50475: Calling all_inventory to load vars for managed_node1 33932 1726882893.50478: Calling groups_inventory to load vars for managed_node1 33932 1726882893.50481: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882893.50493: Calling all_plugins_play to load vars for managed_node1 33932 1726882893.50496: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882893.50499: Calling groups_plugins_play to load vars for managed_node1 33932 1726882893.52371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882893.54195: done with get_vars() 33932 1726882893.54219: done getting variables 33932 1726882893.54293: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:41:33 -0400 (0:00:00.057) 0:00:14.010 ****** 33932 1726882893.54326: entering _queue_task() for managed_node1/fail 33932 1726882893.54635: worker is 1 (out of 1 available) 33932 1726882893.54648: exiting _queue_task() for managed_node1/fail 33932 1726882893.54659: done queuing things up, now waiting for results queue to drain 33932 1726882893.54661: waiting for pending results... 
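The teaming task queued above is gated on `ansible_distribution_major_version | int > 9`. Because gathered facts are strings, the Jinja2 `int` filter does the coercion; its behavior (including the fall-back to 0 for non-numeric input) can be approximated in Python — `jinja_int` is an illustrative stand-in, not Jinja2's real implementation:

```python
def jinja_int(value, default=0):
    """Rough Python equivalent of Jinja2's `int` filter: coerce the value
    to an integer, falling back to `default` (0) when coercion fails."""
    try:
        return int(value)
    except (TypeError, ValueError):
        return default

# `| int > 9` is True only for EL10 and later; on the EL9-or-earlier
# host in this log the clause is False and the fail task is skipped.
print(jinja_int("10") > 9)   # True
print(jinja_int("9") > 9)    # False
print(jinja_int("n/a"))      # 0 -- non-numeric facts never trip the gate
```

The string-vs-int distinction is why some clauses in this log compare against `'6'` as a string while others coerce with `| int` first.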
33932 1726882893.54959: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 33932 1726882893.55104: in run() - task 0e448fcc-3ce9-615b-5c48-00000000001a 33932 1726882893.55126: variable 'ansible_search_path' from source: unknown 33932 1726882893.55144: variable 'ansible_search_path' from source: unknown 33932 1726882893.55187: calling self._execute() 33932 1726882893.55290: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882893.55300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882893.55313: variable 'omit' from source: magic vars 33932 1726882893.55714: variable 'ansible_distribution_major_version' from source: facts 33932 1726882893.55732: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882893.55927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 33932 1726882893.58570: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 33932 1726882893.58665: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 33932 1726882893.58722: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 33932 1726882893.58766: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 33932 1726882893.58799: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 33932 1726882893.58895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882893.58944: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882893.58980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882893.59032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882893.59062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882893.59171: variable 'ansible_distribution_major_version' from source: facts 33932 1726882893.59191: Evaluated conditional (ansible_distribution_major_version | int > 9): False 33932 1726882893.59198: when evaluation is False, skipping this task 33932 1726882893.59205: _execute() done 33932 1726882893.59211: dumping result to json 33932 1726882893.59218: done dumping result, returning 33932 1726882893.59227: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-615b-5c48-00000000001a] 33932 1726882893.59237: sending task result for task 0e448fcc-3ce9-615b-5c48-00000000001a skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 33932 1726882893.59399: no more pending results, returning what we have 33932 1726882893.59403: results queue empty 33932 1726882893.59404: checking for any_errors_fatal 33932 1726882893.59410: done checking for any_errors_fatal 33932 
1726882893.59410: checking for max_fail_percentage 33932 1726882893.59412: done checking for max_fail_percentage 33932 1726882893.59413: checking to see if all hosts have failed and the running result is not ok 33932 1726882893.59414: done checking to see if all hosts have failed 33932 1726882893.59415: getting the remaining hosts for this loop 33932 1726882893.59417: done getting the remaining hosts for this loop 33932 1726882893.59421: getting the next task for host managed_node1 33932 1726882893.59427: done getting next task for host managed_node1 33932 1726882893.59431: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 33932 1726882893.59434: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882893.59448: getting variables 33932 1726882893.59450: in VariableManager get_vars() 33932 1726882893.59496: Calling all_inventory to load vars for managed_node1 33932 1726882893.59499: Calling groups_inventory to load vars for managed_node1 33932 1726882893.59502: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882893.59512: Calling all_plugins_play to load vars for managed_node1 33932 1726882893.59515: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882893.59518: Calling groups_plugins_play to load vars for managed_node1 33932 1726882893.60083: done sending task result for task 0e448fcc-3ce9-615b-5c48-00000000001a 33932 1726882893.60086: WORKER PROCESS EXITING 33932 1726882893.61279: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882893.63150: done with get_vars() 33932 1726882893.63175: done getting variables 33932 1726882893.63275: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:41:33 -0400 (0:00:00.089) 0:00:14.100 ****** 33932 1726882893.63305: entering _queue_task() for managed_node1/dnf 33932 1726882893.63590: worker is 1 (out of 1 available) 33932 1726882893.63605: exiting _queue_task() for managed_node1/dnf 33932 1726882893.63622: done queuing things up, now waiting for results queue to drain 33932 1726882893.63625: waiting for pending results... 
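The DNF-check task queued above ends up skipped because neither `__network_wireless_connections_defined` nor `__network_team_connections_defined` is truthy for the play's connection profiles (the log shows `interface` and `vlan_interface` play vars being resolved while testing them). A plausible sketch of that derivation — the connection list and helper are invented for illustration; the role's defaults compute these flags with Jinja2 expressions, not this code:

```python
# Hypothetical: scan network_connections for a given connection type,
# mirroring what __network_wireless_connections_defined /
# __network_team_connections_defined represent in the role defaults.

def any_connection_of_type(connections, conn_type):
    return any(c.get("type") == conn_type for c in connections)

# Illustrative profiles consistent with the play vars seen in the log
# (an interface plus a VLAN on top of it; names are assumptions).
network_connections = [
    {"name": "eth0", "type": "ethernet"},
    {"name": "eth0.100", "type": "vlan", "parent": "eth0"},
]

wireless = any_connection_of_type(network_connections, "wireless")
team = any_connection_of_type(network_connections, "team")
print(wireless or team)  # False -> the package-update check is skipped
```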
33932 1726882893.63868: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 33932 1726882893.63947: in run() - task 0e448fcc-3ce9-615b-5c48-00000000001b 33932 1726882893.63958: variable 'ansible_search_path' from source: unknown 33932 1726882893.63962: variable 'ansible_search_path' from source: unknown 33932 1726882893.63996: calling self._execute() 33932 1726882893.64062: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882893.64068: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882893.64078: variable 'omit' from source: magic vars 33932 1726882893.64358: variable 'ansible_distribution_major_version' from source: facts 33932 1726882893.64371: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882893.64505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 33932 1726882893.66091: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 33932 1726882893.66143: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 33932 1726882893.66170: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 33932 1726882893.66197: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 33932 1726882893.66217: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 33932 1726882893.66276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882893.66297: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882893.66318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882893.66386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882893.66390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882893.66493: variable 'ansible_distribution' from source: facts 33932 1726882893.66496: variable 'ansible_distribution_major_version' from source: facts 33932 1726882893.66501: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 33932 1726882893.66597: variable '__network_wireless_connections_defined' from source: role '' defaults 33932 1726882893.66697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882893.66700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882893.66716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882893.66745: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882893.66756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882893.67351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882893.67354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882893.67356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882893.67359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882893.67361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882893.67363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882893.67367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 
1726882893.67370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882893.67372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882893.67374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882893.67379: variable 'network_connections' from source: task vars 33932 1726882893.67381: variable 'interface' from source: play vars 33932 1726882893.67383: variable 'interface' from source: play vars 33932 1726882893.67386: variable 'vlan_interface' from source: play vars 33932 1726882893.67388: variable 'vlan_interface' from source: play vars 33932 1726882893.67390: variable 'interface' from source: play vars 33932 1726882893.67876: variable 'interface' from source: play vars 33932 1726882893.67880: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 33932 1726882893.67882: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 33932 1726882893.67885: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 33932 1726882893.67887: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 33932 1726882893.67889: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 33932 1726882893.67891: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, 
class_only=False) 33932 1726882893.67900: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 33932 1726882893.67903: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882893.67905: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 33932 1726882893.67992: variable '__network_team_connections_defined' from source: role '' defaults 33932 1726882893.68139: variable 'network_connections' from source: task vars 33932 1726882893.68143: variable 'interface' from source: play vars 33932 1726882893.68197: variable 'interface' from source: play vars 33932 1726882893.68209: variable 'vlan_interface' from source: play vars 33932 1726882893.68247: variable 'vlan_interface' from source: play vars 33932 1726882893.68253: variable 'interface' from source: play vars 33932 1726882893.68317: variable 'interface' from source: play vars 33932 1726882893.68335: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 33932 1726882893.68338: when evaluation is False, skipping this task 33932 1726882893.68341: _execute() done 33932 1726882893.68343: dumping result to json 33932 1726882893.68347: done dumping result, returning 33932 1726882893.68354: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-615b-5c48-00000000001b] 33932 1726882893.68359: sending task result for task 0e448fcc-3ce9-615b-5c48-00000000001b 33932 1726882893.68457: done 
sending task result for task 0e448fcc-3ce9-615b-5c48-00000000001b 33932 1726882893.68460: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 33932 1726882893.68698: no more pending results, returning what we have 33932 1726882893.68702: results queue empty 33932 1726882893.68703: checking for any_errors_fatal 33932 1726882893.68707: done checking for any_errors_fatal 33932 1726882893.68708: checking for max_fail_percentage 33932 1726882893.68710: done checking for max_fail_percentage 33932 1726882893.68711: checking to see if all hosts have failed and the running result is not ok 33932 1726882893.68711: done checking to see if all hosts have failed 33932 1726882893.68712: getting the remaining hosts for this loop 33932 1726882893.68714: done getting the remaining hosts for this loop 33932 1726882893.68717: getting the next task for host managed_node1 33932 1726882893.68723: done getting next task for host managed_node1 33932 1726882893.68727: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 33932 1726882893.68729: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882893.68741: getting variables 33932 1726882893.68743: in VariableManager get_vars() 33932 1726882893.68785: Calling all_inventory to load vars for managed_node1 33932 1726882893.68788: Calling groups_inventory to load vars for managed_node1 33932 1726882893.68790: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882893.68799: Calling all_plugins_play to load vars for managed_node1 33932 1726882893.68802: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882893.68805: Calling groups_plugins_play to load vars for managed_node1 33932 1726882893.70019: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882893.70967: done with get_vars() 33932 1726882893.70986: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 33932 1726882893.71036: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:41:33 -0400 (0:00:00.077) 0:00:14.178 ****** 33932 1726882893.71057: entering _queue_task() for managed_node1/yum 33932 1726882893.71058: Creating lock for yum 33932 1726882893.71278: worker is 1 (out of 1 available) 33932 1726882893.71292: exiting _queue_task() for managed_node1/yum 33932 1726882893.71302: done queuing things up, now waiting for results queue to drain 33932 1726882893.71304: waiting for pending results... 
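Two related things happen in this block: the controller redirects `ansible.builtin.yum` to `ansible.builtin.dnf` (visible in the "redirecting (type: action)" line above), and the task itself is gated on `ansible_distribution_major_version | int < 8`, which is False here. The version-based split can be sketched as follows — a simplification for illustration; the real redirect is driven by Ansible's routing metadata, not a version check in the role:

```python
def pick_pkg_mgr(major_version):
    """EL hosts below 8 used yum; EL8+ and Fedora use dnf. The `< 8`
    clause in the log gates the yum-specific task the same way."""
    return "yum" if int(major_version) < 8 else "dnf"

print(pick_pkg_mgr("9"))  # dnf -- so the yum task is skipped on this host
print(pick_pkg_mgr("7"))  # yum
```

The role keeps both a DNF-gated and a YUM-gated variant of the same check, and on any given host exactly one of the two guards can be true.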
33932 1726882893.71483: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 33932 1726882893.71553: in run() - task 0e448fcc-3ce9-615b-5c48-00000000001c 33932 1726882893.71565: variable 'ansible_search_path' from source: unknown 33932 1726882893.71573: variable 'ansible_search_path' from source: unknown 33932 1726882893.71601: calling self._execute() 33932 1726882893.71671: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882893.71675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882893.71682: variable 'omit' from source: magic vars 33932 1726882893.71941: variable 'ansible_distribution_major_version' from source: facts 33932 1726882893.71954: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882893.72075: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 33932 1726882893.74395: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 33932 1726882893.74440: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 33932 1726882893.74471: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 33932 1726882893.74503: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 33932 1726882893.74525: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 33932 1726882893.74584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882893.74602: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882893.74623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882893.74651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882893.74662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882893.74728: variable 'ansible_distribution_major_version' from source: facts 33932 1726882893.74739: Evaluated conditional (ansible_distribution_major_version | int < 8): False 33932 1726882893.74744: when evaluation is False, skipping this task 33932 1726882893.74747: _execute() done 33932 1726882893.74749: dumping result to json 33932 1726882893.74752: done dumping result, returning 33932 1726882893.74761: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-615b-5c48-00000000001c] 33932 1726882893.74766: sending task result for task 0e448fcc-3ce9-615b-5c48-00000000001c 33932 1726882893.74848: done sending task result for task 0e448fcc-3ce9-615b-5c48-00000000001c 33932 1726882893.74851: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 33932 1726882893.74914: no more pending results, returning 
what we have 33932 1726882893.74918: results queue empty 33932 1726882893.74919: checking for any_errors_fatal 33932 1726882893.74924: done checking for any_errors_fatal 33932 1726882893.74924: checking for max_fail_percentage 33932 1726882893.74926: done checking for max_fail_percentage 33932 1726882893.74927: checking to see if all hosts have failed and the running result is not ok 33932 1726882893.74927: done checking to see if all hosts have failed 33932 1726882893.74928: getting the remaining hosts for this loop 33932 1726882893.74930: done getting the remaining hosts for this loop 33932 1726882893.74933: getting the next task for host managed_node1 33932 1726882893.74938: done getting next task for host managed_node1 33932 1726882893.74942: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 33932 1726882893.74945: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882893.74957: getting variables 33932 1726882893.74959: in VariableManager get_vars() 33932 1726882893.74999: Calling all_inventory to load vars for managed_node1 33932 1726882893.75002: Calling groups_inventory to load vars for managed_node1 33932 1726882893.75004: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882893.75012: Calling all_plugins_play to load vars for managed_node1 33932 1726882893.75014: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882893.75017: Calling groups_plugins_play to load vars for managed_node1 33932 1726882893.75891: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882893.76817: done with get_vars() 33932 1726882893.76831: done getting variables 33932 1726882893.76874: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:41:33 -0400 (0:00:00.058) 0:00:14.236 ****** 33932 1726882893.76897: entering _queue_task() for managed_node1/fail 33932 1726882893.77096: worker is 1 (out of 1 available) 33932 1726882893.77110: exiting _queue_task() for managed_node1/fail 33932 1726882893.77121: done queuing things up, now waiting for results queue to drain 33932 1726882893.77123: waiting for pending results... 
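Every skipped task in this log reports the same result shape: `changed`, `false_condition`, and `skip_reason`. A minimal reconstruction of that result dict, with field names copied from the `skipping:` JSON printed earlier in the log (the helper function itself is hypothetical):

```python
# Reconstructed shape of a skipped-task result as emitted in this log.
def skipped_result(false_condition):
    return {
        "changed": False,
        "false_condition": false_condition,
        "skip_reason": "Conditional result was False",
    }

r = skipped_result("__network_wireless_connections_defined or "
                   "__network_team_connections_defined")
print(r["skip_reason"])  # Conditional result was False
```

Reading these fields is usually enough to diagnose a surprising skip: `false_condition` names the exact clause that failed, so there is no need to re-trace the evaluation order by hand.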
33932 1726882893.77295: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 33932 1726882893.77379: in run() - task 0e448fcc-3ce9-615b-5c48-00000000001d 33932 1726882893.77390: variable 'ansible_search_path' from source: unknown 33932 1726882893.77394: variable 'ansible_search_path' from source: unknown 33932 1726882893.77423: calling self._execute() 33932 1726882893.77496: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882893.77499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882893.77507: variable 'omit' from source: magic vars 33932 1726882893.77767: variable 'ansible_distribution_major_version' from source: facts 33932 1726882893.77781: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882893.77857: variable '__network_wireless_connections_defined' from source: role '' defaults 33932 1726882893.77995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 33932 1726882893.79547: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 33932 1726882893.79598: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 33932 1726882893.79627: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 33932 1726882893.79652: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 33932 1726882893.79675: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 33932 1726882893.79732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 33932 1726882893.79753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882893.79773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882893.79799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882893.79809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882893.79841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882893.79859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882893.79880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882893.79904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882893.79915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882893.79943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882893.79962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882893.79981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882893.80005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882893.80015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882893.80128: variable 'network_connections' from source: task vars 33932 1726882893.80136: variable 'interface' from source: play vars 33932 1726882893.80190: variable 'interface' from source: play vars 33932 1726882893.80200: variable 'vlan_interface' from source: play vars 33932 1726882893.80242: variable 'vlan_interface' from source: play vars 33932 1726882893.80247: variable 'interface' from source: play vars 33932 1726882893.80294: variable 'interface' from source: play vars 33932 1726882893.80341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 33932 1726882893.80451: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 33932 
1726882893.80480: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 33932 1726882893.80514: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 33932 1726882893.80536: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 33932 1726882893.80567: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 33932 1726882893.80583: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 33932 1726882893.80602: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882893.80623: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 33932 1726882893.80671: variable '__network_team_connections_defined' from source: role '' defaults 33932 1726882893.80816: variable 'network_connections' from source: task vars 33932 1726882893.80821: variable 'interface' from source: play vars 33932 1726882893.80866: variable 'interface' from source: play vars 33932 1726882893.80874: variable 'vlan_interface' from source: play vars 33932 1726882893.80915: variable 'vlan_interface' from source: play vars 33932 1726882893.80920: variable 'interface' from source: play vars 33932 1726882893.80966: variable 'interface' from source: play vars 33932 1726882893.80992: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 33932 1726882893.81002: when 
evaluation is False, skipping this task 33932 1726882893.81005: _execute() done 33932 1726882893.81007: dumping result to json 33932 1726882893.81010: done dumping result, returning 33932 1726882893.81012: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-615b-5c48-00000000001d] 33932 1726882893.81014: sending task result for task 0e448fcc-3ce9-615b-5c48-00000000001d 33932 1726882893.81107: done sending task result for task 0e448fcc-3ce9-615b-5c48-00000000001d 33932 1726882893.81110: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 33932 1726882893.81165: no more pending results, returning what we have 33932 1726882893.81171: results queue empty 33932 1726882893.81172: checking for any_errors_fatal 33932 1726882893.81177: done checking for any_errors_fatal 33932 1726882893.81177: checking for max_fail_percentage 33932 1726882893.81179: done checking for max_fail_percentage 33932 1726882893.81180: checking to see if all hosts have failed and the running result is not ok 33932 1726882893.81181: done checking to see if all hosts have failed 33932 1726882893.81181: getting the remaining hosts for this loop 33932 1726882893.81183: done getting the remaining hosts for this loop 33932 1726882893.81186: getting the next task for host managed_node1 33932 1726882893.81192: done getting next task for host managed_node1 33932 1726882893.81195: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 33932 1726882893.81198: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33932 1726882893.81209: getting variables 33932 1726882893.81211: in VariableManager get_vars() 33932 1726882893.81243: Calling all_inventory to load vars for managed_node1 33932 1726882893.81246: Calling groups_inventory to load vars for managed_node1 33932 1726882893.81248: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882893.81255: Calling all_plugins_play to load vars for managed_node1 33932 1726882893.81265: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882893.81274: Calling groups_plugins_play to load vars for managed_node1 33932 1726882893.82048: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882893.83071: done with get_vars() 33932 1726882893.83087: done getting variables 33932 1726882893.83128: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:41:33 -0400 (0:00:00.062) 0:00:14.299 ****** 33932 1726882893.83150: entering _queue_task() for managed_node1/package 33932 1726882893.83335: worker is 1 (out of 1 available) 33932 1726882893.83348: exiting _queue_task() for managed_node1/package 
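[Editor's sketch] The skip above is driven by a `when:` guard on the role's `fail` action: the log records `false_condition: "__network_wireless_connections_defined or __network_team_connections_defined"`. A minimal hypothetical reconstruction of such a guarded task (built only from the task name, action plugin, and condition in the log; the role's actual YAML at roles/network/tasks/main.yml:60 may differ):

```yaml
# Hypothetical sketch, not the role's verbatim source: the task only fires
# when wireless or team connections are defined, so the conditional above
# evaluated to False and the task was skipped without executing.
- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: Restarting NetworkManager requires user consent.
  when: __network_wireless_connections_defined or __network_team_connections_defined
```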
33932 1726882893.83360: done queuing things up, now waiting for results queue to drain 33932 1726882893.83361: waiting for pending results... 33932 1726882893.83539: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 33932 1726882893.83624: in run() - task 0e448fcc-3ce9-615b-5c48-00000000001e 33932 1726882893.83635: variable 'ansible_search_path' from source: unknown 33932 1726882893.83638: variable 'ansible_search_path' from source: unknown 33932 1726882893.83674: calling self._execute() 33932 1726882893.83737: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882893.83741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882893.83749: variable 'omit' from source: magic vars 33932 1726882893.84018: variable 'ansible_distribution_major_version' from source: facts 33932 1726882893.84028: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882893.84156: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 33932 1726882893.84347: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 33932 1726882893.84380: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 33932 1726882893.84406: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 33932 1726882893.84431: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 33932 1726882893.84515: variable 'network_packages' from source: role '' defaults 33932 1726882893.84587: variable '__network_provider_setup' from source: role '' defaults 33932 1726882893.84595: variable '__network_service_name_default_nm' from source: role '' defaults 33932 1726882893.84646: variable '__network_service_name_default_nm' from source: role '' defaults 33932 
1726882893.84653: variable '__network_packages_default_nm' from source: role '' defaults 33932 1726882893.84700: variable '__network_packages_default_nm' from source: role '' defaults 33932 1726882893.84816: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 33932 1726882893.89200: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 33932 1726882893.89238: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 33932 1726882893.89261: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 33932 1726882893.89291: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 33932 1726882893.89309: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 33932 1726882893.89356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882893.89379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882893.89399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882893.89427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882893.89439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882893.89467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882893.89487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882893.89507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882893.89533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882893.89543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882893.89692: variable '__network_packages_default_gobject_packages' from source: role '' defaults 33932 1726882893.89767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882893.89786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882893.89802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 33932 1726882893.89830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882893.89843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882893.89905: variable 'ansible_python' from source: facts 33932 1726882893.89921: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 33932 1726882893.89982: variable '__network_wpa_supplicant_required' from source: role '' defaults 33932 1726882893.90036: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 33932 1726882893.90121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882893.90138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882893.90158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882893.90188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882893.90198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 
1726882893.90230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882893.90249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882893.90270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882893.90298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882893.90310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882893.90404: variable 'network_connections' from source: task vars 33932 1726882893.90408: variable 'interface' from source: play vars 33932 1726882893.90481: variable 'interface' from source: play vars 33932 1726882893.90491: variable 'vlan_interface' from source: play vars 33932 1726882893.90560: variable 'vlan_interface' from source: play vars 33932 1726882893.90569: variable 'interface' from source: play vars 33932 1726882893.90639: variable 'interface' from source: play vars 33932 1726882893.90693: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 33932 1726882893.90717: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 33932 1726882893.90736: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882893.90757: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 33932 1726882893.90788: variable '__network_wireless_connections_defined' from source: role '' defaults 33932 1726882893.90968: variable 'network_connections' from source: task vars 33932 1726882893.90975: variable 'interface' from source: play vars 33932 1726882893.91047: variable 'interface' from source: play vars 33932 1726882893.91054: variable 'vlan_interface' from source: play vars 33932 1726882893.91125: variable 'vlan_interface' from source: play vars 33932 1726882893.91134: variable 'interface' from source: play vars 33932 1726882893.91206: variable 'interface' from source: play vars 33932 1726882893.91243: variable '__network_packages_default_wireless' from source: role '' defaults 33932 1726882893.91300: variable '__network_wireless_connections_defined' from source: role '' defaults 33932 1726882893.91497: variable 'network_connections' from source: task vars 33932 1726882893.91501: variable 'interface' from source: play vars 33932 1726882893.91546: variable 'interface' from source: play vars 33932 1726882893.91554: variable 'vlan_interface' from source: play vars 33932 1726882893.91604: variable 'vlan_interface' from source: play vars 33932 1726882893.91610: variable 'interface' from source: play vars 33932 1726882893.91654: variable 'interface' from source: play vars 33932 1726882893.91681: variable '__network_packages_default_team' from source: role '' defaults 33932 1726882893.91735: variable 
'__network_team_connections_defined' from source: role '' defaults 33932 1726882893.91932: variable 'network_connections' from source: task vars 33932 1726882893.91935: variable 'interface' from source: play vars 33932 1726882893.91985: variable 'interface' from source: play vars 33932 1726882893.91992: variable 'vlan_interface' from source: play vars 33932 1726882893.92041: variable 'vlan_interface' from source: play vars 33932 1726882893.92046: variable 'interface' from source: play vars 33932 1726882893.92095: variable 'interface' from source: play vars 33932 1726882893.92139: variable '__network_service_name_default_initscripts' from source: role '' defaults 33932 1726882893.92184: variable '__network_service_name_default_initscripts' from source: role '' defaults 33932 1726882893.92190: variable '__network_packages_default_initscripts' from source: role '' defaults 33932 1726882893.92233: variable '__network_packages_default_initscripts' from source: role '' defaults 33932 1726882893.92367: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 33932 1726882893.92659: variable 'network_connections' from source: task vars 33932 1726882893.92667: variable 'interface' from source: play vars 33932 1726882893.92711: variable 'interface' from source: play vars 33932 1726882893.92719: variable 'vlan_interface' from source: play vars 33932 1726882893.92762: variable 'vlan_interface' from source: play vars 33932 1726882893.92766: variable 'interface' from source: play vars 33932 1726882893.92811: variable 'interface' from source: play vars 33932 1726882893.92819: variable 'ansible_distribution' from source: facts 33932 1726882893.92822: variable '__network_rh_distros' from source: role '' defaults 33932 1726882893.92827: variable 'ansible_distribution_major_version' from source: facts 33932 1726882893.92844: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 33932 1726882893.92962: variable 
'ansible_distribution' from source: facts 33932 1726882893.92967: variable '__network_rh_distros' from source: role '' defaults 33932 1726882893.92979: variable 'ansible_distribution_major_version' from source: facts 33932 1726882893.92990: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 33932 1726882893.93099: variable 'ansible_distribution' from source: facts 33932 1726882893.93103: variable '__network_rh_distros' from source: role '' defaults 33932 1726882893.93109: variable 'ansible_distribution_major_version' from source: facts 33932 1726882893.93130: variable 'network_provider' from source: set_fact 33932 1726882893.93142: variable 'ansible_facts' from source: unknown 33932 1726882893.93521: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 33932 1726882893.93524: when evaluation is False, skipping this task 33932 1726882893.93531: _execute() done 33932 1726882893.93534: dumping result to json 33932 1726882893.93540: done dumping result, returning 33932 1726882893.93543: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-615b-5c48-00000000001e] 33932 1726882893.93545: sending task result for task 0e448fcc-3ce9-615b-5c48-00000000001e 33932 1726882893.93628: done sending task result for task 0e448fcc-3ce9-615b-5c48-00000000001e 33932 1726882893.93631: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 33932 1726882893.93680: no more pending results, returning what we have 33932 1726882893.93684: results queue empty 33932 1726882893.93685: checking for any_errors_fatal 33932 1726882893.93691: done checking for any_errors_fatal 33932 1726882893.93692: checking for max_fail_percentage 33932 1726882893.93694: done checking for max_fail_percentage 33932 
1726882893.93695: checking to see if all hosts have failed and the running result is not ok 33932 1726882893.93695: done checking to see if all hosts have failed 33932 1726882893.93696: getting the remaining hosts for this loop 33932 1726882893.93697: done getting the remaining hosts for this loop 33932 1726882893.93701: getting the next task for host managed_node1 33932 1726882893.93706: done getting next task for host managed_node1 33932 1726882893.93714: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 33932 1726882893.93716: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882893.93730: getting variables 33932 1726882893.93731: in VariableManager get_vars() 33932 1726882893.93782: Calling all_inventory to load vars for managed_node1 33932 1726882893.93784: Calling groups_inventory to load vars for managed_node1 33932 1726882893.93787: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882893.93796: Calling all_plugins_play to load vars for managed_node1 33932 1726882893.93798: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882893.93801: Calling groups_plugins_play to load vars for managed_node1 33932 1726882893.97143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882893.98065: done with get_vars() 33932 1726882893.98084: done getting variables 33932 1726882893.98119: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:41:33 -0400 (0:00:00.149) 0:00:14.449 ****** 33932 1726882893.98139: entering _queue_task() for managed_node1/package 33932 1726882893.98374: worker is 1 (out of 1 available) 33932 1726882893.98388: exiting _queue_task() for managed_node1/package 33932 1726882893.98400: done queuing things up, now waiting for results queue to drain 33932 1726882893.98402: waiting for pending results... 
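[Editor's sketch] The "Install packages" skip above happens because every entry in `network_packages` was already present in the gathered package facts, so the logged guard `not network_packages is subset(ansible_facts.packages.keys())` evaluated to False. A hedged, hypothetical sketch of that install-only-if-missing pattern (module and condition taken from the log; the role's real task body is not shown there):

```yaml
# Hypothetical sketch of the guard logged as
# "not network_packages is subset(ansible_facts.packages.keys())":
# run the package install only when some required package is absent
# from the previously gathered package facts.
- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  when: not network_packages is subset(ansible_facts.packages.keys())
```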
33932 1726882893.98584: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 33932 1726882893.98659: in run() - task 0e448fcc-3ce9-615b-5c48-00000000001f 33932 1726882893.98674: variable 'ansible_search_path' from source: unknown 33932 1726882893.98679: variable 'ansible_search_path' from source: unknown 33932 1726882893.98706: calling self._execute() 33932 1726882893.98778: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882893.98782: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882893.98790: variable 'omit' from source: magic vars 33932 1726882893.99064: variable 'ansible_distribution_major_version' from source: facts 33932 1726882893.99075: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882893.99157: variable 'network_state' from source: role '' defaults 33932 1726882893.99167: Evaluated conditional (network_state != {}): False 33932 1726882893.99173: when evaluation is False, skipping this task 33932 1726882893.99178: _execute() done 33932 1726882893.99180: dumping result to json 33932 1726882893.99183: done dumping result, returning 33932 1726882893.99186: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-615b-5c48-00000000001f] 33932 1726882893.99195: sending task result for task 0e448fcc-3ce9-615b-5c48-00000000001f 33932 1726882893.99281: done sending task result for task 0e448fcc-3ce9-615b-5c48-00000000001f 33932 1726882893.99284: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 33932 1726882893.99349: no more pending results, returning what we have 33932 1726882893.99352: results queue empty 33932 1726882893.99353: checking 
for any_errors_fatal 33932 1726882893.99359: done checking for any_errors_fatal 33932 1726882893.99360: checking for max_fail_percentage 33932 1726882893.99361: done checking for max_fail_percentage 33932 1726882893.99362: checking to see if all hosts have failed and the running result is not ok 33932 1726882893.99364: done checking to see if all hosts have failed 33932 1726882893.99365: getting the remaining hosts for this loop 33932 1726882893.99367: done getting the remaining hosts for this loop 33932 1726882893.99373: getting the next task for host managed_node1 33932 1726882893.99383: done getting next task for host managed_node1 33932 1726882893.99386: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 33932 1726882893.99389: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882893.99401: getting variables 33932 1726882893.99403: in VariableManager get_vars() 33932 1726882893.99441: Calling all_inventory to load vars for managed_node1 33932 1726882893.99444: Calling groups_inventory to load vars for managed_node1 33932 1726882893.99446: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882893.99454: Calling all_plugins_play to load vars for managed_node1 33932 1726882893.99456: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882893.99458: Calling groups_plugins_play to load vars for managed_node1 33932 1726882894.00335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882894.01295: done with get_vars() 33932 1726882894.01308: done getting variables 33932 1726882894.01349: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:41:34 -0400 (0:00:00.032) 0:00:14.481 ****** 33932 1726882894.01375: entering _queue_task() for managed_node1/package 33932 1726882894.01566: worker is 1 (out of 1 available) 33932 1726882894.01582: exiting _queue_task() for managed_node1/package 33932 1726882894.01594: done queuing things up, now waiting for results queue to drain 33932 1726882894.01596: waiting for pending results... 
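The trace above shows the role skipping its nmstate install tasks because the conditional `network_state != {}` evaluated False (the role defaults `network_state` to an empty dict). A minimal sketch of this conditional-install pattern follows; the task body is illustrative, not the role's actual source, and the package list is an assumption:

```yaml
# Hypothetical sketch of the skipped task: install nmstate packages only
# when the caller supplied a non-empty network_state dictionary.
- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when:
    - ansible_distribution_major_version != '6'  # conditional seen in the trace
    - network_state != {}                        # False here, so the task skips
```

Because both tasks in the trace share the `network_state != {}` guard, they skip together with `"skip_reason": "Conditional result was False"` rather than executing the package action.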
33932 1726882894.01752: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 33932 1726882894.01831: in run() - task 0e448fcc-3ce9-615b-5c48-000000000020 33932 1726882894.01841: variable 'ansible_search_path' from source: unknown 33932 1726882894.01845: variable 'ansible_search_path' from source: unknown 33932 1726882894.01876: calling self._execute() 33932 1726882894.01940: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882894.01944: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882894.01952: variable 'omit' from source: magic vars 33932 1726882894.02281: variable 'ansible_distribution_major_version' from source: facts 33932 1726882894.02294: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882894.02407: variable 'network_state' from source: role '' defaults 33932 1726882894.02416: Evaluated conditional (network_state != {}): False 33932 1726882894.02419: when evaluation is False, skipping this task 33932 1726882894.02422: _execute() done 33932 1726882894.02424: dumping result to json 33932 1726882894.02428: done dumping result, returning 33932 1726882894.02435: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-615b-5c48-000000000020] 33932 1726882894.02447: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000020 33932 1726882894.02529: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000020 33932 1726882894.02532: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 33932 1726882894.02601: no more pending results, returning what we have 33932 1726882894.02603: results queue empty 33932 1726882894.02604: checking for 
any_errors_fatal 33932 1726882894.02609: done checking for any_errors_fatal 33932 1726882894.02609: checking for max_fail_percentage 33932 1726882894.02611: done checking for max_fail_percentage 33932 1726882894.02612: checking to see if all hosts have failed and the running result is not ok 33932 1726882894.02612: done checking to see if all hosts have failed 33932 1726882894.02613: getting the remaining hosts for this loop 33932 1726882894.02614: done getting the remaining hosts for this loop 33932 1726882894.02617: getting the next task for host managed_node1 33932 1726882894.02621: done getting next task for host managed_node1 33932 1726882894.02625: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 33932 1726882894.02627: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882894.02639: getting variables 33932 1726882894.02641: in VariableManager get_vars() 33932 1726882894.02689: Calling all_inventory to load vars for managed_node1 33932 1726882894.02692: Calling groups_inventory to load vars for managed_node1 33932 1726882894.02695: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882894.02703: Calling all_plugins_play to load vars for managed_node1 33932 1726882894.02706: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882894.02708: Calling groups_plugins_play to load vars for managed_node1 33932 1726882894.04181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882894.05637: done with get_vars() 33932 1726882894.05651: done getting variables 33932 1726882894.05727: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:41:34 -0400 (0:00:00.043) 0:00:14.525 ****** 33932 1726882894.05749: entering _queue_task() for managed_node1/service 33932 1726882894.05750: Creating lock for service 33932 1726882894.05954: worker is 1 (out of 1 available) 33932 1726882894.05972: exiting _queue_task() for managed_node1/service 33932 1726882894.05983: done queuing things up, now waiting for results queue to drain 33932 1726882894.05985: waiting for pending results... 
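The restart task queued above is gated on `__network_wireless_connections_defined or __network_team_connections_defined`, both derived from `network_connections`. A sketch of that pattern, with the service invocation an assumption (the guard variables are taken from the trace):

```yaml
# Hypothetical sketch: restart NetworkManager only when the requested
# connection profiles include wireless or team interfaces.
- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

In this run neither condition holds for the plain and VLAN interfaces in `network_connections`, so the task skips, as the result below confirms.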
33932 1726882894.06143: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 33932 1726882894.06224: in run() - task 0e448fcc-3ce9-615b-5c48-000000000021 33932 1726882894.06235: variable 'ansible_search_path' from source: unknown 33932 1726882894.06238: variable 'ansible_search_path' from source: unknown 33932 1726882894.06273: calling self._execute() 33932 1726882894.06339: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882894.06343: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882894.06350: variable 'omit' from source: magic vars 33932 1726882894.06621: variable 'ansible_distribution_major_version' from source: facts 33932 1726882894.06631: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882894.06717: variable '__network_wireless_connections_defined' from source: role '' defaults 33932 1726882894.06845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 33932 1726882894.08618: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 33932 1726882894.08662: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 33932 1726882894.08701: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 33932 1726882894.08725: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 33932 1726882894.08744: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 33932 1726882894.08804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 33932 1726882894.08824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882894.08841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882894.08876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882894.08883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882894.08914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882894.08931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882894.08948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882894.08981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882894.08988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882894.09015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882894.09032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882894.09049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882894.09080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882894.09092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882894.09201: variable 'network_connections' from source: task vars 33932 1726882894.09207: variable 'interface' from source: play vars 33932 1726882894.09254: variable 'interface' from source: play vars 33932 1726882894.09266: variable 'vlan_interface' from source: play vars 33932 1726882894.09311: variable 'vlan_interface' from source: play vars 33932 1726882894.09317: variable 'interface' from source: play vars 33932 1726882894.09358: variable 'interface' from source: play vars 33932 1726882894.09407: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 33932 1726882894.09516: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 33932 
1726882894.09545: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 33932 1726882894.09576: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 33932 1726882894.09600: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 33932 1726882894.09635: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 33932 1726882894.09647: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 33932 1726882894.09666: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882894.09687: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 33932 1726882894.09733: variable '__network_team_connections_defined' from source: role '' defaults 33932 1726882894.09880: variable 'network_connections' from source: task vars 33932 1726882894.09883: variable 'interface' from source: play vars 33932 1726882894.09925: variable 'interface' from source: play vars 33932 1726882894.09933: variable 'vlan_interface' from source: play vars 33932 1726882894.09979: variable 'vlan_interface' from source: play vars 33932 1726882894.09984: variable 'interface' from source: play vars 33932 1726882894.10025: variable 'interface' from source: play vars 33932 1726882894.10053: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 33932 1726882894.10075: when 
evaluation is False, skipping this task 33932 1726882894.10078: _execute() done 33932 1726882894.10080: dumping result to json 33932 1726882894.10082: done dumping result, returning 33932 1726882894.10084: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-615b-5c48-000000000021] 33932 1726882894.10086: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000021 33932 1726882894.10156: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000021 33932 1726882894.10163: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 33932 1726882894.10211: no more pending results, returning what we have 33932 1726882894.10214: results queue empty 33932 1726882894.10215: checking for any_errors_fatal 33932 1726882894.10222: done checking for any_errors_fatal 33932 1726882894.10222: checking for max_fail_percentage 33932 1726882894.10224: done checking for max_fail_percentage 33932 1726882894.10224: checking to see if all hosts have failed and the running result is not ok 33932 1726882894.10225: done checking to see if all hosts have failed 33932 1726882894.10226: getting the remaining hosts for this loop 33932 1726882894.10227: done getting the remaining hosts for this loop 33932 1726882894.10231: getting the next task for host managed_node1 33932 1726882894.10237: done getting next task for host managed_node1 33932 1726882894.10240: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 33932 1726882894.10242: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33932 1726882894.10255: getting variables 33932 1726882894.10257: in VariableManager get_vars() 33932 1726882894.10305: Calling all_inventory to load vars for managed_node1 33932 1726882894.10307: Calling groups_inventory to load vars for managed_node1 33932 1726882894.10309: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882894.10317: Calling all_plugins_play to load vars for managed_node1 33932 1726882894.10320: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882894.10322: Calling groups_plugins_play to load vars for managed_node1 33932 1726882894.11247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882894.12192: done with get_vars() 33932 1726882894.12206: done getting variables 33932 1726882894.12248: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:41:34 -0400 (0:00:00.065) 0:00:14.590 ****** 33932 1726882894.12273: entering _queue_task() for managed_node1/service 33932 1726882894.12474: worker is 1 (out of 1 available) 33932 1726882894.12489: exiting _queue_task() for managed_node1/service 
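Unlike the preceding skipped tasks, "Enable and start NetworkManager" runs whenever the provider is NetworkManager or `network_state` is non-empty. A sketch of that guard, assuming a `service` task keyed on the role's `network_service_name` variable (the variable names appear in the trace; the task body is illustrative):

```yaml
# Hypothetical sketch: ensure the network service is enabled and running
# when the NM provider is selected or network_state is in use.
- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: "{{ network_service_name }}"
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}
```

Here `network_provider` was set to `nm` via `set_fact`, so the conditional evaluates True and the task proceeds into variable resolution rather than skipping.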
33932 1726882894.12501: done queuing things up, now waiting for results queue to drain 33932 1726882894.12502: waiting for pending results... 33932 1726882894.12663: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 33932 1726882894.12754: in run() - task 0e448fcc-3ce9-615b-5c48-000000000022 33932 1726882894.12769: variable 'ansible_search_path' from source: unknown 33932 1726882894.12776: variable 'ansible_search_path' from source: unknown 33932 1726882894.12809: calling self._execute() 33932 1726882894.12877: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882894.12887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882894.12895: variable 'omit' from source: magic vars 33932 1726882894.13160: variable 'ansible_distribution_major_version' from source: facts 33932 1726882894.13172: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882894.13279: variable 'network_provider' from source: set_fact 33932 1726882894.13282: variable 'network_state' from source: role '' defaults 33932 1726882894.13291: Evaluated conditional (network_provider == "nm" or network_state != {}): True 33932 1726882894.13296: variable 'omit' from source: magic vars 33932 1726882894.13334: variable 'omit' from source: magic vars 33932 1726882894.13354: variable 'network_service_name' from source: role '' defaults 33932 1726882894.13407: variable 'network_service_name' from source: role '' defaults 33932 1726882894.13485: variable '__network_provider_setup' from source: role '' defaults 33932 1726882894.13488: variable '__network_service_name_default_nm' from source: role '' defaults 33932 1726882894.13532: variable '__network_service_name_default_nm' from source: role '' defaults 33932 1726882894.13544: variable '__network_packages_default_nm' from source: role '' defaults 33932 1726882894.13587: variable 
'__network_packages_default_nm' from source: role '' defaults 33932 1726882894.13730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 33932 1726882894.15231: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 33932 1726882894.15286: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 33932 1726882894.15312: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 33932 1726882894.15336: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 33932 1726882894.15355: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 33932 1726882894.15415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882894.15435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882894.15454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882894.15486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882894.15498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 
1726882894.15527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882894.15544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882894.15561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882894.15590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882894.15602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882894.15742: variable '__network_packages_default_gobject_packages' from source: role '' defaults 33932 1726882894.15819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882894.15834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882894.15851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882894.15878: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882894.15890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882894.15952: variable 'ansible_python' from source: facts 33932 1726882894.15973: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 33932 1726882894.16026: variable '__network_wpa_supplicant_required' from source: role '' defaults 33932 1726882894.16082: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 33932 1726882894.16166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882894.16183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882894.16200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882894.16225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882894.16238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882894.16273: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882894.16293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882894.16309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882894.16335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882894.16346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882894.16436: variable 'network_connections' from source: task vars 33932 1726882894.16443: variable 'interface' from source: play vars 33932 1726882894.16498: variable 'interface' from source: play vars 33932 1726882894.16509: variable 'vlan_interface' from source: play vars 33932 1726882894.16563: variable 'vlan_interface' from source: play vars 33932 1726882894.16576: variable 'interface' from source: play vars 33932 1726882894.16622: variable 'interface' from source: play vars 33932 1726882894.16695: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 33932 1726882894.16823: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 33932 1726882894.16857: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 33932 1726882894.16890: Loading TestModule 
'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 33932 1726882894.16921: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 33932 1726882894.16961: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 33932 1726882894.16986: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 33932 1726882894.17011: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882894.17036: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 33932 1726882894.17074: variable '__network_wireless_connections_defined' from source: role '' defaults 33932 1726882894.17249: variable 'network_connections' from source: task vars 33932 1726882894.17255: variable 'interface' from source: play vars 33932 1726882894.17309: variable 'interface' from source: play vars 33932 1726882894.17322: variable 'vlan_interface' from source: play vars 33932 1726882894.17374: variable 'vlan_interface' from source: play vars 33932 1726882894.17381: variable 'interface' from source: play vars 33932 1726882894.17431: variable 'interface' from source: play vars 33932 1726882894.17470: variable '__network_packages_default_wireless' from source: role '' defaults 33932 1726882894.17522: variable '__network_wireless_connections_defined' from source: role '' defaults 33932 1726882894.17711: variable 'network_connections' from source: task vars 33932 1726882894.17714: variable 
'interface' from source: play vars 33932 1726882894.17767: variable 'interface' from source: play vars 33932 1726882894.17777: variable 'vlan_interface' from source: play vars 33932 1726882894.17823: variable 'vlan_interface' from source: play vars 33932 1726882894.17828: variable 'interface' from source: play vars 33932 1726882894.17884: variable 'interface' from source: play vars 33932 1726882894.17901: variable '__network_packages_default_team' from source: role '' defaults 33932 1726882894.17953: variable '__network_team_connections_defined' from source: role '' defaults 33932 1726882894.18139: variable 'network_connections' from source: task vars 33932 1726882894.18143: variable 'interface' from source: play vars 33932 1726882894.18199: variable 'interface' from source: play vars 33932 1726882894.18202: variable 'vlan_interface' from source: play vars 33932 1726882894.18251: variable 'vlan_interface' from source: play vars 33932 1726882894.18256: variable 'interface' from source: play vars 33932 1726882894.18308: variable 'interface' from source: play vars 33932 1726882894.18353: variable '__network_service_name_default_initscripts' from source: role '' defaults 33932 1726882894.18397: variable '__network_service_name_default_initscripts' from source: role '' defaults 33932 1726882894.18404: variable '__network_packages_default_initscripts' from source: role '' defaults 33932 1726882894.18446: variable '__network_packages_default_initscripts' from source: role '' defaults 33932 1726882894.18583: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 33932 1726882894.19023: variable 'network_connections' from source: task vars 33932 1726882894.19027: variable 'interface' from source: play vars 33932 1726882894.19076: variable 'interface' from source: play vars 33932 1726882894.19081: variable 'vlan_interface' from source: play vars 33932 1726882894.19122: variable 'vlan_interface' from source: play vars 33932 1726882894.19127: 
variable 'interface' from source: play vars 33932 1726882894.19176: variable 'interface' from source: play vars 33932 1726882894.19179: variable 'ansible_distribution' from source: facts 33932 1726882894.19186: variable '__network_rh_distros' from source: role '' defaults 33932 1726882894.19189: variable 'ansible_distribution_major_version' from source: facts 33932 1726882894.19206: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 33932 1726882894.19319: variable 'ansible_distribution' from source: facts 33932 1726882894.19323: variable '__network_rh_distros' from source: role '' defaults 33932 1726882894.19326: variable 'ansible_distribution_major_version' from source: facts 33932 1726882894.19337: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 33932 1726882894.19450: variable 'ansible_distribution' from source: facts 33932 1726882894.19453: variable '__network_rh_distros' from source: role '' defaults 33932 1726882894.19456: variable 'ansible_distribution_major_version' from source: facts 33932 1726882894.19488: variable 'network_provider' from source: set_fact 33932 1726882894.19509: variable 'omit' from source: magic vars 33932 1726882894.19528: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882894.19547: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882894.19560: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882894.19575: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882894.19584: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882894.19607: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 33932 1726882894.19610: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882894.19617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882894.19680: Set connection var ansible_shell_executable to /bin/sh 33932 1726882894.19687: Set connection var ansible_timeout to 10 33932 1726882894.19693: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882894.19696: Set connection var ansible_pipelining to False 33932 1726882894.19698: Set connection var ansible_connection to ssh 33932 1726882894.19705: Set connection var ansible_shell_type to sh 33932 1726882894.19725: variable 'ansible_shell_executable' from source: unknown 33932 1726882894.19728: variable 'ansible_connection' from source: unknown 33932 1726882894.19731: variable 'ansible_module_compression' from source: unknown 33932 1726882894.19733: variable 'ansible_shell_type' from source: unknown 33932 1726882894.19735: variable 'ansible_shell_executable' from source: unknown 33932 1726882894.19737: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882894.19740: variable 'ansible_pipelining' from source: unknown 33932 1726882894.19742: variable 'ansible_timeout' from source: unknown 33932 1726882894.19743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882894.19813: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882894.19817: variable 'omit' from source: magic vars 33932 1726882894.19824: starting attempt loop 33932 1726882894.19826: running the handler 33932 1726882894.19883: variable 'ansible_facts' from source: unknown 33932 1726882894.20338: _low_level_execute_command(): starting 
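The record above ends as the action plugin hands off to `_low_level_execute_command()`. As the subsequent records show, every remote step is wrapped in `/bin/sh -c '… && sleep 0'` and its rc/stdout/stderr are captured. A minimal local sketch of that wrapping, assuming a hypothetical stand-in function `low_level_execute` (the real method routes the command through the ssh connection plugin rather than running it locally):

```python
import subprocess

def low_level_execute(cmd: str):
    # Hypothetical local sketch of the pattern in this log's
    # "executing: /bin/sh -c 'echo ~ && sleep 0'" records: wrap the
    # command in /bin/sh -c, append the trailing "sleep 0" seen on
    # every command here, and capture rc, stdout and stderr.
    proc = subprocess.run(
        ["/bin/sh", "-c", f"{cmd} && sleep 0"],
        capture_output=True,
        text=True,
    )
    return proc.returncode, proc.stdout, proc.stderr

# The home-directory probe from the log: stdout comes back as "/root".
rc, out, err = low_level_execute("echo ~")
```

The `echo ~` probe is how the executor learns the remote home directory before creating its `~/.ansible/tmp/ansible-tmp-…` working directory in the next command.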
33932 1726882894.20343: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 33932 1726882894.20852: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882894.20871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882894.20884: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 33932 1726882894.20902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882894.20918: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882894.20952: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882894.20966: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882894.21084: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882894.22765: stdout chunk (state=3): >>>/root <<< 33932 1726882894.22870: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882894.22916: stderr chunk (state=3): >>><<< 33932 1726882894.22923: stdout chunk (state=3): >>><<< 33932 1726882894.22942: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882894.22952: _low_level_execute_command(): starting 33932 1726882894.22957: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882894.2294111-34668-264678000860449 `" && echo ansible-tmp-1726882894.2294111-34668-264678000860449="` echo /root/.ansible/tmp/ansible-tmp-1726882894.2294111-34668-264678000860449 `" ) && sleep 0' 33932 1726882894.23396: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882894.23399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882894.23430: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 
originally 10.31.44.90 debug2: match not found <<< 33932 1726882894.23433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882894.23435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 33932 1726882894.23437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882894.23493: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882894.23499: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882894.23594: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882894.25458: stdout chunk (state=3): >>>ansible-tmp-1726882894.2294111-34668-264678000860449=/root/.ansible/tmp/ansible-tmp-1726882894.2294111-34668-264678000860449 <<< 33932 1726882894.25573: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882894.25615: stderr chunk (state=3): >>><<< 33932 1726882894.25618: stdout chunk (state=3): >>><<< 33932 1726882894.25631: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882894.2294111-34668-264678000860449=/root/.ansible/tmp/ansible-tmp-1726882894.2294111-34668-264678000860449 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882894.25656: variable 'ansible_module_compression' from source: unknown 33932 1726882894.25700: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 33932 1726882894.25703: ANSIBALLZ: Acquiring lock 33932 1726882894.25706: ANSIBALLZ: Lock acquired: 140301144901104 33932 1726882894.25708: ANSIBALLZ: Creating module 33932 1726882894.44346: ANSIBALLZ: Writing module into payload 33932 1726882894.44478: ANSIBALLZ: Writing module 33932 1726882894.44504: ANSIBALLZ: Renaming module 33932 1726882894.44507: ANSIBALLZ: Done creating module 33932 1726882894.44536: variable 'ansible_facts' from source: unknown 33932 1726882894.44672: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882894.2294111-34668-264678000860449/AnsiballZ_systemd.py 33932 1726882894.44784: Sending initial data 33932 1726882894.44787: Sent initial data (156 bytes) 33932 1726882894.45509: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882894.45513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882894.45548: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882894.45552: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882894.45554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 33932 1726882894.45556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882894.45613: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882894.45615: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882894.45618: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882894.45724: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882894.47575: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" 
revision 1 <<< 33932 1726882894.47667: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 33932 1726882894.47760: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-33932njcnmxb6/tmp2xl31pvb /root/.ansible/tmp/ansible-tmp-1726882894.2294111-34668-264678000860449/AnsiballZ_systemd.py <<< 33932 1726882894.47856: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 33932 1726882894.49825: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882894.49932: stderr chunk (state=3): >>><<< 33932 1726882894.49935: stdout chunk (state=3): >>><<< 33932 1726882894.49952: done transferring module to remote 33932 1726882894.49961: _low_level_execute_command(): starting 33932 1726882894.49966: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882894.2294111-34668-264678000860449/ /root/.ansible/tmp/ansible-tmp-1726882894.2294111-34668-264678000860449/AnsiballZ_systemd.py && sleep 0' 33932 1726882894.50423: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882894.50430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882894.50465: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882894.50475: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration <<< 33932 1726882894.50488: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882894.50494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882894.50549: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882894.50561: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882894.50661: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882894.52389: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882894.52436: stderr chunk (state=3): >>><<< 33932 1726882894.52439: stdout chunk (state=3): >>><<< 33932 1726882894.52452: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882894.52454: _low_level_execute_command(): starting 33932 1726882894.52459: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882894.2294111-34668-264678000860449/AnsiballZ_systemd.py && sleep 0' 33932 1726882894.52906: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882894.52919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882894.52944: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882894.52960: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882894.53013: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882894.53018: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882894.53138: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882894.78389: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": 
"dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "72917", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:40:57 EDT", "ExecMainStartTimestampMonotonic": "1015349250", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "72917", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Fri 2024-09-20 21:40:57 EDT] ; stop_time=[n/a] ; pid=72917 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Fri 2024-09-20 21:40:57 EDT] ; stop_time=[n/a] ; pid=72917 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": <<< 33932 
1726882894.78425: stdout chunk (state=3): >>>"system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5449", "MemoryCurrent": "5918720", "MemoryAvailable": "infinity", "CPUUsageNSec": "128374000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", 
"LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": 
"default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service network.target multi-user.target network.service cloud-init.service", "After": "network-pre.target system.slice cloud-init-local.service sysinit.target systemd-journald.socket basic.target dbus-broker.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:40:57 EDT", "StateChangeTimestampMonotonic": "1015433030", "InactiveExitTimestamp": "Fri 2024-09-20 21:40:57 EDT", "InactiveExitTimestampMonotonic": "1015349539", "ActiveEnterTimestamp": "Fri 2024-09-20 21:40:57 EDT", "ActiveEnterTimestampMonotonic": "1015433030", "ActiveExitTimestamp": "Fri 2024-09-20 21:40:57 EDT", "ActiveExitTimestampMonotonic": "1015317264", "InactiveEnterTimestamp": "Fri 2024-09-20 21:40:57 EDT", "InactiveEnterTimestampMonotonic": "1015342641", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": 
"no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:40:57 EDT", "ConditionTimestampMonotonic": "1015343435", "AssertTimestamp": "Fri 2024-09-20 21:40:57 EDT", "AssertTimestampMonotonic": "1015343438", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "d73a95d8f1ea4be78e350e6440c36a44", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 33932 1726882894.79958: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882894.79961: stderr chunk (state=3): >>>Shared connection to 10.31.44.90 closed. 
<<< 33932 1726882894.80028: stderr chunk (state=3): >>><<< 33932 1726882894.80031: stdout chunk (state=3): >>><<< 33932 1726882894.80046: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "72917", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:40:57 EDT", "ExecMainStartTimestampMonotonic": "1015349250", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "72917", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Fri 2024-09-20 21:40:57 EDT] ; stop_time=[n/a] ; pid=72917 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Fri 2024-09-20 21:40:57 EDT] ; stop_time=[n/a] ; pid=72917 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ 
path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5449", "MemoryCurrent": "5918720", "MemoryAvailable": "infinity", "CPUUsageNSec": "128374000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": 
"infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", 
"TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service network.target multi-user.target network.service cloud-init.service", "After": "network-pre.target system.slice cloud-init-local.service sysinit.target systemd-journald.socket basic.target dbus-broker.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:40:57 EDT", "StateChangeTimestampMonotonic": "1015433030", "InactiveExitTimestamp": "Fri 2024-09-20 21:40:57 EDT", "InactiveExitTimestampMonotonic": "1015349539", "ActiveEnterTimestamp": "Fri 2024-09-20 21:40:57 EDT", "ActiveEnterTimestampMonotonic": "1015433030", "ActiveExitTimestamp": "Fri 2024-09-20 21:40:57 EDT", "ActiveExitTimestampMonotonic": "1015317264", "InactiveEnterTimestamp": "Fri 2024-09-20 21:40:57 EDT", "InactiveEnterTimestampMonotonic": "1015342641", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", 
"StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:40:57 EDT", "ConditionTimestampMonotonic": "1015343435", "AssertTimestamp": "Fri 2024-09-20 21:40:57 EDT", "AssertTimestampMonotonic": "1015343438", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "d73a95d8f1ea4be78e350e6440c36a44", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 33932 1726882894.80161: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882894.2294111-34668-264678000860449/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 33932 1726882894.80181: _low_level_execute_command(): starting 33932 1726882894.80185: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882894.2294111-34668-264678000860449/ > /dev/null 2>&1 && sleep 0' 33932 1726882894.80637: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882894.80662: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882894.80678: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 33932 1726882894.80689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882894.80732: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882894.80746: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882894.80756: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882894.80858: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882894.82779: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882894.82977: stderr chunk (state=3): >>><<< 33932 1726882894.82980: stdout chunk (state=3): >>><<< 33932 1726882894.82983: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882894.82985: handler run complete 33932 1726882894.82987: attempt loop complete, returning result 33932 1726882894.82989: _execute() done 33932 1726882894.82991: dumping result to json 33932 1726882894.82993: done dumping result, returning 33932 1726882894.82995: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-615b-5c48-000000000022] 33932 1726882894.82997: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000022 33932 1726882894.83201: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000022 33932 1726882894.83203: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 33932 1726882894.83250: no more pending results, returning what we have 33932 1726882894.83253: results queue empty 33932 1726882894.83254: checking for any_errors_fatal 33932 1726882894.83260: done checking for any_errors_fatal 33932 1726882894.83260: checking for max_fail_percentage 33932 1726882894.83262: done checking for max_fail_percentage 33932 1726882894.83262: checking to see if all hosts have failed and the running result is not ok 33932 1726882894.83265: done checking to see if all hosts have failed 33932 1726882894.83266: getting the remaining hosts for this loop 33932 1726882894.83270: done getting the remaining hosts for this loop 33932 1726882894.83273: getting the next task for host managed_node1 33932 1726882894.83280: done getting next task for host managed_node1 33932 1726882894.83283: ^ task is: TASK: fedora.linux_system_roles.network : Enable and 
start wpa_supplicant 33932 1726882894.83286: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33932 1726882894.83296: getting variables 33932 1726882894.83297: in VariableManager get_vars() 33932 1726882894.83331: Calling all_inventory to load vars for managed_node1 33932 1726882894.83333: Calling groups_inventory to load vars for managed_node1 33932 1726882894.83336: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882894.83345: Calling all_plugins_play to load vars for managed_node1 33932 1726882894.83348: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882894.83350: Calling groups_plugins_play to load vars for managed_node1 33932 1726882894.84951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882894.86837: done with get_vars() 33932 1726882894.86861: done getting variables 33932 1726882894.86917: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:41:34 -0400 
(0:00:00.746) 0:00:15.337 ****** 33932 1726882894.86955: entering _queue_task() for managed_node1/service 33932 1726882894.87321: worker is 1 (out of 1 available) 33932 1726882894.87335: exiting _queue_task() for managed_node1/service 33932 1726882894.87349: done queuing things up, now waiting for results queue to drain 33932 1726882894.87351: waiting for pending results... 33932 1726882894.87670: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 33932 1726882894.87832: in run() - task 0e448fcc-3ce9-615b-5c48-000000000023 33932 1726882894.87852: variable 'ansible_search_path' from source: unknown 33932 1726882894.87861: variable 'ansible_search_path' from source: unknown 33932 1726882894.87919: calling self._execute() 33932 1726882894.88034: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882894.88049: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882894.88063: variable 'omit' from source: magic vars 33932 1726882894.88501: variable 'ansible_distribution_major_version' from source: facts 33932 1726882894.88519: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882894.88648: variable 'network_provider' from source: set_fact 33932 1726882894.88659: Evaluated conditional (network_provider == "nm"): True 33932 1726882894.88765: variable '__network_wpa_supplicant_required' from source: role '' defaults 33932 1726882894.88871: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 33932 1726882894.89070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 33932 1726882894.91103: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 33932 1726882894.91149: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 33932 
1726882894.91183: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 33932 1726882894.91206: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 33932 1726882894.91227: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 33932 1726882894.91299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882894.91321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882894.91339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882894.91366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882894.91380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882894.91415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882894.91430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882894.91448: 
Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882894.91478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882894.91488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882894.91519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882894.91536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882894.91553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882894.91582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882894.91592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882894.91691: variable 'network_connections' from source: task vars 33932 1726882894.91701: variable 'interface' from source: play vars 33932 1726882894.91755: variable 'interface' from source: play vars 33932 
1726882894.91768: variable 'vlan_interface' from source: play vars 33932 1726882894.91811: variable 'vlan_interface' from source: play vars 33932 1726882894.91817: variable 'interface' from source: play vars 33932 1726882894.91860: variable 'interface' from source: play vars 33932 1726882894.91913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 33932 1726882894.92026: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 33932 1726882894.92054: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 33932 1726882894.92084: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 33932 1726882894.92115: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 33932 1726882894.92146: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 33932 1726882894.92199: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 33932 1726882894.92205: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882894.92228: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 33932 1726882894.92279: variable '__network_wireless_connections_defined' from source: role '' defaults 33932 1726882894.92520: variable 'network_connections' from source: task vars 33932 1726882894.92523: variable 'interface' 
from source: play vars 33932 1726882894.92585: variable 'interface' from source: play vars 33932 1726882894.92595: variable 'vlan_interface' from source: play vars 33932 1726882894.92654: variable 'vlan_interface' from source: play vars 33932 1726882894.92660: variable 'interface' from source: play vars 33932 1726882894.92722: variable 'interface' from source: play vars 33932 1726882894.92765: Evaluated conditional (__network_wpa_supplicant_required): False 33932 1726882894.92769: when evaluation is False, skipping this task 33932 1726882894.92774: _execute() done 33932 1726882894.92777: dumping result to json 33932 1726882894.92781: done dumping result, returning 33932 1726882894.92789: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-615b-5c48-000000000023] 33932 1726882894.92794: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000023 33932 1726882894.92890: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000023 33932 1726882894.92893: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 33932 1726882894.92935: no more pending results, returning what we have 33932 1726882894.92938: results queue empty 33932 1726882894.92939: checking for any_errors_fatal 33932 1726882894.92967: done checking for any_errors_fatal 33932 1726882894.92968: checking for max_fail_percentage 33932 1726882894.92969: done checking for max_fail_percentage 33932 1726882894.92970: checking to see if all hosts have failed and the running result is not ok 33932 1726882894.92971: done checking to see if all hosts have failed 33932 1726882894.92972: getting the remaining hosts for this loop 33932 1726882894.92973: done getting the remaining hosts for this loop 33932 1726882894.92978: getting the next task for host managed_node1 33932 1726882894.92984: done getting 
next task for host managed_node1 33932 1726882894.92988: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 33932 1726882894.92991: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33932 1726882894.93004: getting variables 33932 1726882894.93006: in VariableManager get_vars() 33932 1726882894.93043: Calling all_inventory to load vars for managed_node1 33932 1726882894.93045: Calling groups_inventory to load vars for managed_node1 33932 1726882894.93048: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882894.93057: Calling all_plugins_play to load vars for managed_node1 33932 1726882894.93059: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882894.93061: Calling groups_plugins_play to load vars for managed_node1 33932 1726882894.94188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882894.95128: done with get_vars() 33932 1726882894.95145: done getting variables 33932 1726882894.95190: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:41:34 -0400 (0:00:00.082) 0:00:15.419 ****** 33932 1726882894.95213: entering _queue_task() for managed_node1/service 33932 1726882894.95428: worker is 1 (out of 1 available) 33932 1726882894.95441: exiting _queue_task() for managed_node1/service 33932 1726882894.95452: done queuing things up, now waiting for results queue to drain 33932 1726882894.95455: waiting for pending results... 33932 1726882894.95643: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 33932 1726882894.95827: in run() - task 0e448fcc-3ce9-615b-5c48-000000000024 33932 1726882894.95831: variable 'ansible_search_path' from source: unknown 33932 1726882894.95834: variable 'ansible_search_path' from source: unknown 33932 1726882894.95838: calling self._execute() 33932 1726882894.96083: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882894.96087: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882894.96090: variable 'omit' from source: magic vars 33932 1726882894.96322: variable 'ansible_distribution_major_version' from source: facts 33932 1726882894.96335: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882894.96468: variable 'network_provider' from source: set_fact 33932 1726882894.96477: Evaluated conditional (network_provider == "initscripts"): False 33932 1726882894.96480: when evaluation is False, skipping this task 33932 1726882894.96483: _execute() done 33932 1726882894.96486: dumping result to json 33932 1726882894.96489: done dumping result, returning 33932 1726882894.96497: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-615b-5c48-000000000024] 33932 1726882894.96504: sending task result for task 
0e448fcc-3ce9-615b-5c48-000000000024 33932 1726882894.96590: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000024 33932 1726882894.96593: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 33932 1726882894.96635: no more pending results, returning what we have 33932 1726882894.96638: results queue empty 33932 1726882894.96639: checking for any_errors_fatal 33932 1726882894.96646: done checking for any_errors_fatal 33932 1726882894.96647: checking for max_fail_percentage 33932 1726882894.96649: done checking for max_fail_percentage 33932 1726882894.96649: checking to see if all hosts have failed and the running result is not ok 33932 1726882894.96650: done checking to see if all hosts have failed 33932 1726882894.96651: getting the remaining hosts for this loop 33932 1726882894.96653: done getting the remaining hosts for this loop 33932 1726882894.96656: getting the next task for host managed_node1 33932 1726882894.96661: done getting next task for host managed_node1 33932 1726882894.96670: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 33932 1726882894.96675: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882894.96689: getting variables 33932 1726882894.96691: in VariableManager get_vars() 33932 1726882894.96723: Calling all_inventory to load vars for managed_node1 33932 1726882894.96725: Calling groups_inventory to load vars for managed_node1 33932 1726882894.96727: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882894.96736: Calling all_plugins_play to load vars for managed_node1 33932 1726882894.96738: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882894.96740: Calling groups_plugins_play to load vars for managed_node1 33932 1726882894.98177: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882894.99121: done with get_vars() 33932 1726882894.99136: done getting variables 33932 1726882894.99181: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:41:34 -0400 (0:00:00.039) 0:00:15.459 ****** 33932 1726882894.99206: entering _queue_task() for managed_node1/copy 33932 1726882894.99414: worker is 1 (out of 1 available) 33932 1726882894.99428: exiting _queue_task() for managed_node1/copy 33932 1726882894.99440: done queuing things up, now waiting for results queue to drain 33932 1726882894.99442: waiting for pending results... 
33932 1726882894.99621: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 33932 1726882894.99712: in run() - task 0e448fcc-3ce9-615b-5c48-000000000025 33932 1726882894.99750: variable 'ansible_search_path' from source: unknown 33932 1726882894.99756: variable 'ansible_search_path' from source: unknown 33932 1726882894.99797: calling self._execute() 33932 1726882894.99895: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882894.99899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882894.99907: variable 'omit' from source: magic vars 33932 1726882895.00274: variable 'ansible_distribution_major_version' from source: facts 33932 1726882895.00302: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882895.00428: variable 'network_provider' from source: set_fact 33932 1726882895.00438: Evaluated conditional (network_provider == "initscripts"): False 33932 1726882895.00445: when evaluation is False, skipping this task 33932 1726882895.00451: _execute() done 33932 1726882895.00458: dumping result to json 33932 1726882895.00467: done dumping result, returning 33932 1726882895.00480: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-615b-5c48-000000000025] 33932 1726882895.00494: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000025 33932 1726882895.00609: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000025 33932 1726882895.00620: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 33932 1726882895.00893: no more pending results, returning what we have 33932 1726882895.00896: results queue empty 33932 1726882895.00897: checking for 
any_errors_fatal 33932 1726882895.00902: done checking for any_errors_fatal 33932 1726882895.00903: checking for max_fail_percentage 33932 1726882895.00905: done checking for max_fail_percentage 33932 1726882895.00908: checking to see if all hosts have failed and the running result is not ok 33932 1726882895.00909: done checking to see if all hosts have failed 33932 1726882895.00910: getting the remaining hosts for this loop 33932 1726882895.00911: done getting the remaining hosts for this loop 33932 1726882895.00915: getting the next task for host managed_node1 33932 1726882895.00921: done getting next task for host managed_node1 33932 1726882895.00925: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 33932 1726882895.00928: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882895.00949: getting variables 33932 1726882895.00951: in VariableManager get_vars() 33932 1726882895.00992: Calling all_inventory to load vars for managed_node1 33932 1726882895.00995: Calling groups_inventory to load vars for managed_node1 33932 1726882895.00997: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882895.01006: Calling all_plugins_play to load vars for managed_node1 33932 1726882895.01008: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882895.01014: Calling groups_plugins_play to load vars for managed_node1 33932 1726882895.02335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882895.04057: done with get_vars() 33932 1726882895.04076: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:41:35 -0400 (0:00:00.049) 0:00:15.508 ****** 33932 1726882895.04134: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 33932 1726882895.04135: Creating lock for fedora.linux_system_roles.network_connections 33932 1726882895.04343: worker is 1 (out of 1 available) 33932 1726882895.04358: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 33932 1726882895.04374: done queuing things up, now waiting for results queue to drain 33932 1726882895.04376: waiting for pending results... 
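
For readers tracing the control flow: the `skipping: [managed_node1]` result recorded above is produced by the role's conditional guards. Below is a hedged reconstruction of the task at `roles/network/tasks/main.yml:150` — only the task name, the `copy` action (the log loads `ActionModule 'copy'` for it), and the two `when` conditions (taken verbatim from the evaluated conditionals in the log) are confirmed; the destination and content are assumptions for illustration.

```yaml
# Hypothetical sketch -- dest/content are assumptions, not the role's actual values.
- name: Ensure initscripts network file dependency is present
  copy:
    dest: /etc/sysconfig/network        # assumed destination
    content: "# Created by the network system role"
  when:
    - ansible_distribution_major_version != '6'   # evaluated True in the log
    - network_provider == "initscripts"           # evaluated False -> task skipped
```

Because `network_provider` was set to `nm` earlier in the run, the second condition evaluates False and the task is skipped with `"false_condition": "network_provider == \"initscripts\""`, exactly as the JSON result above shows.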
33932 1726882895.04538: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 33932 1726882895.04618: in run() - task 0e448fcc-3ce9-615b-5c48-000000000026 33932 1726882895.04630: variable 'ansible_search_path' from source: unknown 33932 1726882895.04634: variable 'ansible_search_path' from source: unknown 33932 1726882895.04665: calling self._execute() 33932 1726882895.04736: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882895.04741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882895.04749: variable 'omit' from source: magic vars 33932 1726882895.05023: variable 'ansible_distribution_major_version' from source: facts 33932 1726882895.05034: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882895.05045: variable 'omit' from source: magic vars 33932 1726882895.05083: variable 'omit' from source: magic vars 33932 1726882895.05280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 33932 1726882895.08595: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 33932 1726882895.08640: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 33932 1726882895.08670: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 33932 1726882895.08696: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 33932 1726882895.08716: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 33932 1726882895.08777: variable 'network_provider' from source: set_fact 33932 1726882895.08875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882895.08904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882895.08922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882895.08948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882895.08963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882895.09015: variable 'omit' from source: magic vars 33932 1726882895.09094: variable 'omit' from source: magic vars 33932 1726882895.09161: variable 'network_connections' from source: task vars 33932 1726882895.09173: variable 'interface' from source: play vars 33932 1726882895.09221: variable 'interface' from source: play vars 33932 1726882895.09231: variable 'vlan_interface' from source: play vars 33932 1726882895.09274: variable 'vlan_interface' from source: play vars 33932 1726882895.09279: variable 'interface' from source: play vars 33932 1726882895.09323: variable 'interface' from source: play vars 33932 1726882895.09448: variable 'omit' from source: magic vars 33932 1726882895.09455: variable '__lsr_ansible_managed' from source: task vars 33932 1726882895.09501: variable '__lsr_ansible_managed' from source: task vars 33932 1726882895.09688: Loaded config def from plugin (lookup/template) 33932 1726882895.09692: Loading LookupModule 'template' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 33932 1726882895.09713: File lookup term: get_ansible_managed.j2 33932 1726882895.09716: variable 'ansible_search_path' from source: unknown 33932 1726882895.09721: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 33932 1726882895.09733: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 33932 1726882895.09748: variable 'ansible_search_path' from source: unknown 33932 1726882895.13171: variable 'ansible_managed' from source: unknown 33932 1726882895.13244: variable 'omit' from source: magic vars 33932 1726882895.13266: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882895.13288: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882895.13301: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882895.13316: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882895.13325: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882895.13346: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882895.13350: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882895.13352: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882895.13418: Set connection var ansible_shell_executable to /bin/sh 33932 1726882895.13425: Set connection var ansible_timeout to 10 33932 1726882895.13430: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882895.13435: Set connection var ansible_pipelining to False 33932 1726882895.13438: Set connection var ansible_connection to ssh 33932 1726882895.13440: Set connection var ansible_shell_type to sh 33932 1726882895.13457: variable 'ansible_shell_executable' from source: unknown 33932 1726882895.13460: variable 'ansible_connection' from source: unknown 33932 1726882895.13465: variable 'ansible_module_compression' from source: unknown 33932 1726882895.13467: variable 'ansible_shell_type' from source: unknown 33932 1726882895.13472: variable 'ansible_shell_executable' from source: unknown 33932 1726882895.13475: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882895.13477: variable 'ansible_pipelining' from source: unknown 33932 1726882895.13480: variable 'ansible_timeout' from source: unknown 33932 1726882895.13482: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882895.13574: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 33932 1726882895.13581: variable 'omit' from source: magic vars 33932 1726882895.13588: starting attempt loop 33932 1726882895.13591: running the handler 33932 1726882895.13600: _low_level_execute_command(): starting 33932 1726882895.13607: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 33932 1726882895.14132: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882895.14147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882895.14172: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882895.14191: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882895.14235: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882895.14240: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882895.14254: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882895.14363: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 
1726882895.16029: stdout chunk (state=3): >>>/root <<< 33932 1726882895.16131: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882895.16183: stderr chunk (state=3): >>><<< 33932 1726882895.16188: stdout chunk (state=3): >>><<< 33932 1726882895.16205: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882895.16218: _low_level_execute_command(): starting 33932 1726882895.16226: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882895.1620805-34686-21009115953711 `" && echo ansible-tmp-1726882895.1620805-34686-21009115953711="` echo /root/.ansible/tmp/ansible-tmp-1726882895.1620805-34686-21009115953711 `" ) && sleep 0' 33932 1726882895.16666: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882895.16673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882895.16703: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882895.16715: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882895.16778: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882895.16789: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882895.16890: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882895.18740: stdout chunk (state=3): >>>ansible-tmp-1726882895.1620805-34686-21009115953711=/root/.ansible/tmp/ansible-tmp-1726882895.1620805-34686-21009115953711 <<< 33932 1726882895.18853: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882895.18902: stderr chunk (state=3): >>><<< 33932 1726882895.18905: stdout chunk (state=3): >>><<< 33932 1726882895.18921: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882895.1620805-34686-21009115953711=/root/.ansible/tmp/ansible-tmp-1726882895.1620805-34686-21009115953711 , 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882895.18954: variable 'ansible_module_compression' from source: unknown 33932 1726882895.18993: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 33932 1726882895.18998: ANSIBALLZ: Acquiring lock 33932 1726882895.19001: ANSIBALLZ: Lock acquired: 140301136736624 33932 1726882895.19003: ANSIBALLZ: Creating module 33932 1726882895.32016: ANSIBALLZ: Writing module into payload 33932 1726882895.32361: ANSIBALLZ: Writing module 33932 1726882895.32386: ANSIBALLZ: Renaming module 33932 1726882895.32389: ANSIBALLZ: Done creating module 33932 1726882895.32411: variable 'ansible_facts' from source: unknown 33932 1726882895.32478: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882895.1620805-34686-21009115953711/AnsiballZ_network_connections.py 33932 1726882895.32584: Sending initial data 33932 
1726882895.32598: Sent initial data (167 bytes) 33932 1726882895.33286: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882895.33293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882895.33328: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882895.33334: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration <<< 33932 1726882895.33342: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882895.33348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882895.33357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882895.33362: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882895.33434: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882895.33437: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882895.33547: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882895.35384: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension 
"fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 33932 1726882895.35480: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 33932 1726882895.35573: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-33932njcnmxb6/tmpcxi_y4q4 /root/.ansible/tmp/ansible-tmp-1726882895.1620805-34686-21009115953711/AnsiballZ_network_connections.py <<< 33932 1726882895.35666: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 33932 1726882895.37019: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882895.37120: stderr chunk (state=3): >>><<< 33932 1726882895.37123: stdout chunk (state=3): >>><<< 33932 1726882895.37140: done transferring module to remote 33932 1726882895.37149: _low_level_execute_command(): starting 33932 1726882895.37154: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882895.1620805-34686-21009115953711/ /root/.ansible/tmp/ansible-tmp-1726882895.1620805-34686-21009115953711/AnsiballZ_network_connections.py && sleep 0' 33932 1726882895.37595: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882895.37601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882895.37652: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 33932 
1726882895.37655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882895.37658: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882895.37660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882895.37710: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882895.37718: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882895.37826: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882895.39558: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882895.39611: stderr chunk (state=3): >>><<< 33932 1726882895.39614: stdout chunk (state=3): >>><<< 33932 1726882895.39627: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882895.39630: _low_level_execute_command(): starting 33932 1726882895.39635: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882895.1620805-34686-21009115953711/AnsiballZ_network_connections.py && sleep 0' 33932 1726882895.40081: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882895.40085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882895.40123: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882895.40126: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882895.40128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882895.40181: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882895.40190: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882895.40287: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882895.70272: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[005] #0, state:up persistent_state:present, 'lsr101': add connection lsr101, bf4b0bae-03ff-4dc6-a59c-c7d19007aec3\n[006] #1, state:up persistent_state:present, 'lsr101.90': add connection lsr101.90, 6b10c778-6398-4428-94af-1aa693ccf4b1\n[007] #0, state:up persistent_state:present, 'lsr101': up connection lsr101, bf4b0bae-03ff-4dc6-a59c-c7d19007aec3 (not-active)\n[008] #1, state:up persistent_state:present, 'lsr101.90': up connection lsr101.90, 6b10c778-6398-4428-94af-1aa693ccf4b1 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "type": "ethernet", "state": "up", "mtu": 1492, "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}, {"name": "lsr101.90", "parent": "lsr101", "type": "vlan", "vlan_id": 90, "mtu": 1280, "state": "up", "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "type": "ethernet", "state": "up", "mtu": 1492, "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}, {"name": "lsr101.90", "parent": "lsr101", "type": "vlan", "vlan_id": 90, "mtu": 1280, "state": "up", "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 33932 1726882895.73091: stderr chunk (state=3): >>>debug2: Received 
exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 33932 1726882895.73184: stderr chunk (state=3): >>><<< 33932 1726882895.73188: stdout chunk (state=3): >>><<< 33932 1726882895.73278: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[005] #0, state:up persistent_state:present, 'lsr101': add connection lsr101, bf4b0bae-03ff-4dc6-a59c-c7d19007aec3\n[006] #1, state:up persistent_state:present, 'lsr101.90': add connection lsr101.90, 6b10c778-6398-4428-94af-1aa693ccf4b1\n[007] #0, state:up persistent_state:present, 'lsr101': up connection lsr101, bf4b0bae-03ff-4dc6-a59c-c7d19007aec3 (not-active)\n[008] #1, state:up persistent_state:present, 'lsr101.90': up connection lsr101.90, 6b10c778-6398-4428-94af-1aa693ccf4b1 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "type": "ethernet", "state": "up", "mtu": 1492, "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}, {"name": "lsr101.90", "parent": "lsr101", "type": "vlan", "vlan_id": 90, "mtu": 1280, "state": "up", "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "type": "ethernet", "state": "up", "mtu": 1492, "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}, {"name": "lsr101.90", "parent": "lsr101", "type": "vlan", "vlan_id": 90, "mtu": 1280, "state": "up", "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 33932 1726882895.73282: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr101', 'type': 'ethernet', 'state': 'up', 'mtu': 1492, 'autoconnect': False, 'ip': {'dhcp4': False, 'auto6': False}}, {'name': 'lsr101.90', 'parent': 'lsr101', 'type': 'vlan', 'vlan_id': 90, 'mtu': 1280, 'state': 'up', 'autoconnect': False, 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, 
'_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882895.1620805-34686-21009115953711/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 33932 1726882895.73372: _low_level_execute_command(): starting 33932 1726882895.73376: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882895.1620805-34686-21009115953711/ > /dev/null 2>&1 && sleep 0' 33932 1726882895.74948: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882895.75027: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882895.75041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882895.75057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882895.75104: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882895.75116: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882895.75135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882895.75214: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882895.75230: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882895.75246: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882895.75257: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882895.75274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882895.75288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882895.75297: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.44.90 originally 10.31.44.90 <<< 33932 1726882895.75306: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882895.75316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882895.75394: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882895.75414: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882895.75428: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882895.75558: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882895.77493: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882895.77496: stdout chunk (state=3): >>><<< 33932 1726882895.77499: stderr chunk (state=3): >>><<< 33932 1726882895.77572: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882895.77575: handler run complete 33932 1726882895.77776: attempt loop complete, returning result 33932 1726882895.77779: _execute() done 33932 1726882895.77782: dumping result to json 33932 1726882895.77784: done dumping result, returning 33932 1726882895.77786: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-615b-5c48-000000000026] 33932 1726882895.77788: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000026 33932 1726882895.77880: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000026 33932 1726882895.77883: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "mtu": 1492, "name": "lsr101", "state": "up", "type": "ethernet" }, { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "mtu": 1280, "name": "lsr101.90", "parent": "lsr101", "state": "up", "type": "vlan", "vlan_id": 90 } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [005] #0, state:up persistent_state:present, 'lsr101': add connection lsr101, bf4b0bae-03ff-4dc6-a59c-c7d19007aec3 [006] #1, state:up persistent_state:present, 'lsr101.90': add connection lsr101.90, 6b10c778-6398-4428-94af-1aa693ccf4b1 [007] #0, state:up persistent_state:present, 'lsr101': up connection lsr101, bf4b0bae-03ff-4dc6-a59c-c7d19007aec3 (not-active) [008] #1, state:up persistent_state:present, 'lsr101.90': up connection lsr101.90, 6b10c778-6398-4428-94af-1aa693ccf4b1 (not-active) 33932 1726882895.78029: no more pending results, returning what we have 33932 1726882895.78033: results queue empty 33932 1726882895.78034: checking for 
any_errors_fatal 33932 1726882895.78043: done checking for any_errors_fatal 33932 1726882895.78043: checking for max_fail_percentage 33932 1726882895.78046: done checking for max_fail_percentage 33932 1726882895.78047: checking to see if all hosts have failed and the running result is not ok 33932 1726882895.78048: done checking to see if all hosts have failed 33932 1726882895.78048: getting the remaining hosts for this loop 33932 1726882895.78050: done getting the remaining hosts for this loop 33932 1726882895.78054: getting the next task for host managed_node1 33932 1726882895.78061: done getting next task for host managed_node1 33932 1726882895.78066: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 33932 1726882895.78072: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882895.78084: getting variables 33932 1726882895.78086: in VariableManager get_vars() 33932 1726882895.78125: Calling all_inventory to load vars for managed_node1 33932 1726882895.78128: Calling groups_inventory to load vars for managed_node1 33932 1726882895.78131: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882895.78141: Calling all_plugins_play to load vars for managed_node1 33932 1726882895.78144: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882895.78147: Calling groups_plugins_play to load vars for managed_node1 33932 1726882895.80898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882895.82663: done with get_vars() 33932 1726882895.82692: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:41:35 -0400 (0:00:00.786) 0:00:16.295 ****** 33932 1726882895.82782: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 33932 1726882895.82784: Creating lock for fedora.linux_system_roles.network_state 33932 1726882895.83283: worker is 1 (out of 1 available) 33932 1726882895.83298: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 33932 1726882895.83310: done queuing things up, now waiting for results queue to drain 33932 1726882895.83312: waiting for pending results... 
33932 1726882895.84074: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 33932 1726882895.84229: in run() - task 0e448fcc-3ce9-615b-5c48-000000000027 33932 1726882895.84249: variable 'ansible_search_path' from source: unknown 33932 1726882895.84256: variable 'ansible_search_path' from source: unknown 33932 1726882895.84302: calling self._execute() 33932 1726882895.84405: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882895.84418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882895.84437: variable 'omit' from source: magic vars 33932 1726882895.84822: variable 'ansible_distribution_major_version' from source: facts 33932 1726882895.84840: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882895.84972: variable 'network_state' from source: role '' defaults 33932 1726882895.84994: Evaluated conditional (network_state != {}): False 33932 1726882895.85002: when evaluation is False, skipping this task 33932 1726882895.85009: _execute() done 33932 1726882895.85016: dumping result to json 33932 1726882895.85024: done dumping result, returning 33932 1726882895.85033: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-615b-5c48-000000000027] 33932 1726882895.85043: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000027 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 33932 1726882895.85196: no more pending results, returning what we have 33932 1726882895.85200: results queue empty 33932 1726882895.85201: checking for any_errors_fatal 33932 1726882895.85213: done checking for any_errors_fatal 33932 1726882895.85213: checking for max_fail_percentage 33932 1726882895.85215: done checking for max_fail_percentage 33932 1726882895.85215: 
checking to see if all hosts have failed and the running result is not ok 33932 1726882895.85216: done checking to see if all hosts have failed 33932 1726882895.85217: getting the remaining hosts for this loop 33932 1726882895.85219: done getting the remaining hosts for this loop 33932 1726882895.85222: getting the next task for host managed_node1 33932 1726882895.85229: done getting next task for host managed_node1 33932 1726882895.85232: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 33932 1726882895.85236: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882895.85251: getting variables 33932 1726882895.85253: in VariableManager get_vars() 33932 1726882895.85300: Calling all_inventory to load vars for managed_node1 33932 1726882895.85302: Calling groups_inventory to load vars for managed_node1 33932 1726882895.85305: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882895.85317: Calling all_plugins_play to load vars for managed_node1 33932 1726882895.85319: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882895.85322: Calling groups_plugins_play to load vars for managed_node1 33932 1726882895.86330: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000027 33932 1726882895.86333: WORKER PROCESS EXITING 33932 1726882895.88404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882895.90232: done with get_vars() 33932 1726882895.90256: done getting variables 33932 1726882895.90323: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:41:35 -0400 (0:00:00.075) 0:00:16.371 ****** 33932 1726882895.90355: entering _queue_task() for managed_node1/debug 33932 1726882895.90675: worker is 1 (out of 1 available) 33932 1726882895.90688: exiting _queue_task() for managed_node1/debug 33932 1726882895.90699: done queuing things up, now waiting for results queue to drain 33932 1726882895.90701: waiting for pending results... 
33932 1726882895.90997: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 33932 1726882895.91162: in run() - task 0e448fcc-3ce9-615b-5c48-000000000028 33932 1726882895.91194: variable 'ansible_search_path' from source: unknown 33932 1726882895.91208: variable 'ansible_search_path' from source: unknown 33932 1726882895.91332: calling self._execute() 33932 1726882895.91432: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882895.91444: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882895.91457: variable 'omit' from source: magic vars 33932 1726882895.91873: variable 'ansible_distribution_major_version' from source: facts 33932 1726882895.91893: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882895.91904: variable 'omit' from source: magic vars 33932 1726882895.91971: variable 'omit' from source: magic vars 33932 1726882895.92011: variable 'omit' from source: magic vars 33932 1726882895.92052: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882895.92095: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882895.92119: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882895.92142: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882895.92158: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882895.92202: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882895.92210: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882895.92217: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 33932 1726882895.92330: Set connection var ansible_shell_executable to /bin/sh 33932 1726882895.92343: Set connection var ansible_timeout to 10 33932 1726882895.92352: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882895.92361: Set connection var ansible_pipelining to False 33932 1726882895.92374: Set connection var ansible_connection to ssh 33932 1726882895.92381: Set connection var ansible_shell_type to sh 33932 1726882895.92413: variable 'ansible_shell_executable' from source: unknown 33932 1726882895.92421: variable 'ansible_connection' from source: unknown 33932 1726882895.92428: variable 'ansible_module_compression' from source: unknown 33932 1726882895.92435: variable 'ansible_shell_type' from source: unknown 33932 1726882895.92441: variable 'ansible_shell_executable' from source: unknown 33932 1726882895.92448: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882895.92454: variable 'ansible_pipelining' from source: unknown 33932 1726882895.92460: variable 'ansible_timeout' from source: unknown 33932 1726882895.92472: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882895.92633: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882895.92648: variable 'omit' from source: magic vars 33932 1726882895.92657: starting attempt loop 33932 1726882895.92665: running the handler 33932 1726882895.92861: variable '__network_connections_result' from source: set_fact 33932 1726882895.93045: handler run complete 33932 1726882895.93078: attempt loop complete, returning result 33932 1726882895.93085: _execute() done 33932 1726882895.93093: dumping result to json 33932 1726882895.93101: 
done dumping result, returning 33932 1726882895.93114: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-615b-5c48-000000000028] 33932 1726882895.93165: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000028 ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "[005] #0, state:up persistent_state:present, 'lsr101': add connection lsr101, bf4b0bae-03ff-4dc6-a59c-c7d19007aec3", "[006] #1, state:up persistent_state:present, 'lsr101.90': add connection lsr101.90, 6b10c778-6398-4428-94af-1aa693ccf4b1", "[007] #0, state:up persistent_state:present, 'lsr101': up connection lsr101, bf4b0bae-03ff-4dc6-a59c-c7d19007aec3 (not-active)", "[008] #1, state:up persistent_state:present, 'lsr101.90': up connection lsr101.90, 6b10c778-6398-4428-94af-1aa693ccf4b1 (not-active)" ] } 33932 1726882895.93337: no more pending results, returning what we have 33932 1726882895.93340: results queue empty 33932 1726882895.93341: checking for any_errors_fatal 33932 1726882895.93348: done checking for any_errors_fatal 33932 1726882895.93349: checking for max_fail_percentage 33932 1726882895.93350: done checking for max_fail_percentage 33932 1726882895.93351: checking to see if all hosts have failed and the running result is not ok 33932 1726882895.93352: done checking to see if all hosts have failed 33932 1726882895.93353: getting the remaining hosts for this loop 33932 1726882895.93355: done getting the remaining hosts for this loop 33932 1726882895.93358: getting the next task for host managed_node1 33932 1726882895.93370: done getting next task for host managed_node1 33932 1726882895.93374: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 33932 1726882895.93379: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33932 1726882895.93392: getting variables 33932 1726882895.93394: in VariableManager get_vars() 33932 1726882895.93436: Calling all_inventory to load vars for managed_node1 33932 1726882895.93439: Calling groups_inventory to load vars for managed_node1 33932 1726882895.93441: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882895.93453: Calling all_plugins_play to load vars for managed_node1 33932 1726882895.93456: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882895.93459: Calling groups_plugins_play to load vars for managed_node1 33932 1726882895.94773: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000028 33932 1726882895.94776: WORKER PROCESS EXITING 33932 1726882895.96452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882895.98639: done with get_vars() 33932 1726882895.98662: done getting variables 33932 1726882895.98728: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:41:35 -0400 (0:00:00.084) 0:00:16.455 ****** 33932 
1726882895.98770: entering _queue_task() for managed_node1/debug 33932 1726882895.99088: worker is 1 (out of 1 available) 33932 1726882895.99103: exiting _queue_task() for managed_node1/debug 33932 1726882895.99114: done queuing things up, now waiting for results queue to drain 33932 1726882895.99115: waiting for pending results... 33932 1726882895.99393: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 33932 1726882895.99530: in run() - task 0e448fcc-3ce9-615b-5c48-000000000029 33932 1726882895.99553: variable 'ansible_search_path' from source: unknown 33932 1726882895.99566: variable 'ansible_search_path' from source: unknown 33932 1726882895.99610: calling self._execute() 33932 1726882895.99711: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882895.99722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882895.99735: variable 'omit' from source: magic vars 33932 1726882896.00123: variable 'ansible_distribution_major_version' from source: facts 33932 1726882896.00142: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882896.00153: variable 'omit' from source: magic vars 33932 1726882896.00222: variable 'omit' from source: magic vars 33932 1726882896.00262: variable 'omit' from source: magic vars 33932 1726882896.00310: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882896.00351: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882896.00381: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882896.00403: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882896.00419: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882896.00456: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882896.00467: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882896.00477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882896.00584: Set connection var ansible_shell_executable to /bin/sh 33932 1726882896.00598: Set connection var ansible_timeout to 10 33932 1726882896.00607: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882896.00616: Set connection var ansible_pipelining to False 33932 1726882896.00622: Set connection var ansible_connection to ssh 33932 1726882896.00627: Set connection var ansible_shell_type to sh 33932 1726882896.00658: variable 'ansible_shell_executable' from source: unknown 33932 1726882896.00667: variable 'ansible_connection' from source: unknown 33932 1726882896.00679: variable 'ansible_module_compression' from source: unknown 33932 1726882896.00685: variable 'ansible_shell_type' from source: unknown 33932 1726882896.00691: variable 'ansible_shell_executable' from source: unknown 33932 1726882896.00697: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882896.00703: variable 'ansible_pipelining' from source: unknown 33932 1726882896.00708: variable 'ansible_timeout' from source: unknown 33932 1726882896.00715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882896.00857: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882896.00881: variable 'omit' from source: magic vars 33932 1726882896.00892: starting attempt 
loop 33932 1726882896.00899: running the handler 33932 1726882896.00949: variable '__network_connections_result' from source: set_fact 33932 1726882896.01031: variable '__network_connections_result' from source: set_fact 33932 1726882896.01210: handler run complete 33932 1726882896.01250: attempt loop complete, returning result 33932 1726882896.01257: _execute() done 33932 1726882896.01265: dumping result to json 33932 1726882896.01278: done dumping result, returning 33932 1726882896.01290: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-615b-5c48-000000000029] 33932 1726882896.01306: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000029 ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "mtu": 1492, "name": "lsr101", "state": "up", "type": "ethernet" }, { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "mtu": 1280, "name": "lsr101.90", "parent": "lsr101", "state": "up", "type": "vlan", "vlan_id": 90 } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[005] #0, state:up persistent_state:present, 'lsr101': add connection lsr101, bf4b0bae-03ff-4dc6-a59c-c7d19007aec3\n[006] #1, state:up persistent_state:present, 'lsr101.90': add connection lsr101.90, 6b10c778-6398-4428-94af-1aa693ccf4b1\n[007] #0, state:up persistent_state:present, 'lsr101': up connection lsr101, bf4b0bae-03ff-4dc6-a59c-c7d19007aec3 (not-active)\n[008] #1, state:up persistent_state:present, 'lsr101.90': up connection lsr101.90, 6b10c778-6398-4428-94af-1aa693ccf4b1 (not-active)\n", "stderr_lines": [ "[005] #0, state:up persistent_state:present, 'lsr101': add connection lsr101, 
bf4b0bae-03ff-4dc6-a59c-c7d19007aec3", "[006] #1, state:up persistent_state:present, 'lsr101.90': add connection lsr101.90, 6b10c778-6398-4428-94af-1aa693ccf4b1", "[007] #0, state:up persistent_state:present, 'lsr101': up connection lsr101, bf4b0bae-03ff-4dc6-a59c-c7d19007aec3 (not-active)", "[008] #1, state:up persistent_state:present, 'lsr101.90': up connection lsr101.90, 6b10c778-6398-4428-94af-1aa693ccf4b1 (not-active)" ] } } 33932 1726882896.01516: no more pending results, returning what we have 33932 1726882896.01520: results queue empty 33932 1726882896.01520: checking for any_errors_fatal 33932 1726882896.01526: done checking for any_errors_fatal 33932 1726882896.01527: checking for max_fail_percentage 33932 1726882896.01529: done checking for max_fail_percentage 33932 1726882896.01529: checking to see if all hosts have failed and the running result is not ok 33932 1726882896.01530: done checking to see if all hosts have failed 33932 1726882896.01531: getting the remaining hosts for this loop 33932 1726882896.01533: done getting the remaining hosts for this loop 33932 1726882896.01536: getting the next task for host managed_node1 33932 1726882896.01543: done getting next task for host managed_node1 33932 1726882896.01547: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 33932 1726882896.01550: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882896.01571: getting variables 33932 1726882896.01574: in VariableManager get_vars() 33932 1726882896.01613: Calling all_inventory to load vars for managed_node1 33932 1726882896.01616: Calling groups_inventory to load vars for managed_node1 33932 1726882896.01618: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882896.01629: Calling all_plugins_play to load vars for managed_node1 33932 1726882896.01632: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882896.01634: Calling groups_plugins_play to load vars for managed_node1 33932 1726882896.02657: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000029 33932 1726882896.02660: WORKER PROCESS EXITING 33932 1726882896.03378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882896.05332: done with get_vars() 33932 1726882896.05357: done getting variables 33932 1726882896.05662: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:41:36 -0400 (0:00:00.069) 0:00:16.524 ****** 33932 1726882896.05701: entering _queue_task() for managed_node1/debug 33932 1726882896.06331: worker is 1 (out of 1 available) 33932 1726882896.06344: exiting _queue_task() for managed_node1/debug 33932 1726882896.06354: done queuing things up, now waiting for results queue to drain 33932 1726882896.06356: waiting for pending results... 
33932 1726882896.07266: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 33932 1726882896.07623: in run() - task 0e448fcc-3ce9-615b-5c48-00000000002a 33932 1726882896.07647: variable 'ansible_search_path' from source: unknown 33932 1726882896.07651: variable 'ansible_search_path' from source: unknown 33932 1726882896.07692: calling self._execute() 33932 1726882896.07901: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882896.07906: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882896.07915: variable 'omit' from source: magic vars 33932 1726882896.08848: variable 'ansible_distribution_major_version' from source: facts 33932 1726882896.08860: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882896.09100: variable 'network_state' from source: role '' defaults 33932 1726882896.09111: Evaluated conditional (network_state != {}): False 33932 1726882896.09114: when evaluation is False, skipping this task 33932 1726882896.09117: _execute() done 33932 1726882896.09120: dumping result to json 33932 1726882896.09123: done dumping result, returning 33932 1726882896.09130: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-615b-5c48-00000000002a] 33932 1726882896.09136: sending task result for task 0e448fcc-3ce9-615b-5c48-00000000002a 33932 1726882896.09439: done sending task result for task 0e448fcc-3ce9-615b-5c48-00000000002a 33932 1726882896.09442: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 33932 1726882896.09495: no more pending results, returning what we have 33932 1726882896.09499: results queue empty 33932 1726882896.09499: checking for any_errors_fatal 33932 1726882896.09511: done checking for any_errors_fatal 33932 1726882896.09512: checking for 
max_fail_percentage 33932 1726882896.09513: done checking for max_fail_percentage 33932 1726882896.09514: checking to see if all hosts have failed and the running result is not ok 33932 1726882896.09515: done checking to see if all hosts have failed 33932 1726882896.09516: getting the remaining hosts for this loop 33932 1726882896.09518: done getting the remaining hosts for this loop 33932 1726882896.09522: getting the next task for host managed_node1 33932 1726882896.09528: done getting next task for host managed_node1 33932 1726882896.09532: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 33932 1726882896.09536: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882896.09552: getting variables 33932 1726882896.09554: in VariableManager get_vars() 33932 1726882896.09603: Calling all_inventory to load vars for managed_node1 33932 1726882896.09606: Calling groups_inventory to load vars for managed_node1 33932 1726882896.09609: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882896.09623: Calling all_plugins_play to load vars for managed_node1 33932 1726882896.09627: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882896.09630: Calling groups_plugins_play to load vars for managed_node1 33932 1726882896.13057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882896.14958: done with get_vars() 33932 1726882896.14988: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:41:36 -0400 (0:00:00.093) 0:00:16.618 ****** 33932 1726882896.15090: entering _queue_task() for managed_node1/ping 33932 1726882896.15092: Creating lock for ping 33932 1726882896.15762: worker is 1 (out of 1 available) 33932 1726882896.15781: exiting _queue_task() for managed_node1/ping 33932 1726882896.15792: done queuing things up, now waiting for results queue to drain 33932 1726882896.15793: waiting for pending results... 
33932 1726882896.16458: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 33932 1726882896.16615: in run() - task 0e448fcc-3ce9-615b-5c48-00000000002b 33932 1726882896.16635: variable 'ansible_search_path' from source: unknown 33932 1726882896.16641: variable 'ansible_search_path' from source: unknown 33932 1726882896.16688: calling self._execute() 33932 1726882896.16807: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882896.16909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882896.16923: variable 'omit' from source: magic vars 33932 1726882896.17300: variable 'ansible_distribution_major_version' from source: facts 33932 1726882896.17317: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882896.17328: variable 'omit' from source: magic vars 33932 1726882896.17420: variable 'omit' from source: magic vars 33932 1726882896.17458: variable 'omit' from source: magic vars 33932 1726882896.17508: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882896.17545: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882896.17573: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882896.17598: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882896.17614: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882896.17645: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882896.17654: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882896.17662: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 33932 1726882896.17778: Set connection var ansible_shell_executable to /bin/sh 33932 1726882896.17792: Set connection var ansible_timeout to 10 33932 1726882896.17803: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882896.17812: Set connection var ansible_pipelining to False 33932 1726882896.17817: Set connection var ansible_connection to ssh 33932 1726882896.17823: Set connection var ansible_shell_type to sh 33932 1726882896.17848: variable 'ansible_shell_executable' from source: unknown 33932 1726882896.17854: variable 'ansible_connection' from source: unknown 33932 1726882896.17861: variable 'ansible_module_compression' from source: unknown 33932 1726882896.17872: variable 'ansible_shell_type' from source: unknown 33932 1726882896.17880: variable 'ansible_shell_executable' from source: unknown 33932 1726882896.17887: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882896.17894: variable 'ansible_pipelining' from source: unknown 33932 1726882896.17902: variable 'ansible_timeout' from source: unknown 33932 1726882896.17911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882896.18111: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 33932 1726882896.18131: variable 'omit' from source: magic vars 33932 1726882896.18139: starting attempt loop 33932 1726882896.18145: running the handler 33932 1726882896.18159: _low_level_execute_command(): starting 33932 1726882896.18176: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 33932 1726882896.18916: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882896.18934: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 
1726882896.18950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882896.18972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882896.19023: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882896.19036: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882896.19052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882896.19078: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882896.19093: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882896.19110: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882896.19124: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882896.19138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882896.19153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882896.19170: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882896.19186: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882896.19204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882896.19288: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882896.19311: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882896.19332: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882896.19472: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 
1726882896.21142: stdout chunk (state=3): >>>/root <<< 33932 1726882896.21345: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882896.21349: stdout chunk (state=3): >>><<< 33932 1726882896.21351: stderr chunk (state=3): >>><<< 33932 1726882896.21480: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882896.21484: _low_level_execute_command(): starting 33932 1726882896.21488: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882896.2137747-34726-243816969274876 `" && echo ansible-tmp-1726882896.2137747-34726-243816969274876="` echo /root/.ansible/tmp/ansible-tmp-1726882896.2137747-34726-243816969274876 `" ) && sleep 0' 33932 1726882896.22137: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 33932 1726882896.22153: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882896.22176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882896.22197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882896.22243: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882896.22261: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882896.22282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882896.22301: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882896.22313: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882896.22324: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882896.22336: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882896.22350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882896.22379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882896.22391: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882896.22402: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882896.22416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882896.22503: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882896.22526: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882896.22543: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 33932 1726882896.22680: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882896.24529: stdout chunk (state=3): >>>ansible-tmp-1726882896.2137747-34726-243816969274876=/root/.ansible/tmp/ansible-tmp-1726882896.2137747-34726-243816969274876 <<< 33932 1726882896.24711: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882896.24715: stdout chunk (state=3): >>><<< 33932 1726882896.24717: stderr chunk (state=3): >>><<< 33932 1726882896.25171: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882896.2137747-34726-243816969274876=/root/.ansible/tmp/ansible-tmp-1726882896.2137747-34726-243816969274876 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882896.25175: variable 'ansible_module_compression' from source: unknown 33932 1726882896.25177: ANSIBALLZ: Using lock for ping 33932 
1726882896.25179: ANSIBALLZ: Acquiring lock 33932 1726882896.25180: ANSIBALLZ: Lock acquired: 140301136641680 33932 1726882896.25182: ANSIBALLZ: Creating module 33932 1726882896.39340: ANSIBALLZ: Writing module into payload 33932 1726882896.39412: ANSIBALLZ: Writing module 33932 1726882896.39440: ANSIBALLZ: Renaming module 33932 1726882896.39450: ANSIBALLZ: Done creating module 33932 1726882896.39476: variable 'ansible_facts' from source: unknown 33932 1726882896.39539: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882896.2137747-34726-243816969274876/AnsiballZ_ping.py 33932 1726882896.39711: Sending initial data 33932 1726882896.39715: Sent initial data (153 bytes) 33932 1726882896.40723: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882896.40738: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882896.40752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882896.40775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882896.40817: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882896.40828: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882896.40842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882896.40859: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882896.40875: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882896.40886: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882896.40897: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882896.40909: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 33932 1726882896.40923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882896.40933: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882896.40943: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882896.40956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882896.41037: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882896.41053: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882896.41075: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882896.41205: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882896.43051: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 33932 1726882896.43145: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 33932 1726882896.43244: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-33932njcnmxb6/tmp3cx340fp /root/.ansible/tmp/ansible-tmp-1726882896.2137747-34726-243816969274876/AnsiballZ_ping.py <<< 33932 1726882896.43336: stderr chunk (state=3): >>>debug1: Couldn't stat remote 
file: No such file or directory <<< 33932 1726882896.44533: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882896.44807: stderr chunk (state=3): >>><<< 33932 1726882896.44810: stdout chunk (state=3): >>><<< 33932 1726882896.44812: done transferring module to remote 33932 1726882896.44815: _low_level_execute_command(): starting 33932 1726882896.44817: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882896.2137747-34726-243816969274876/ /root/.ansible/tmp/ansible-tmp-1726882896.2137747-34726-243816969274876/AnsiballZ_ping.py && sleep 0' 33932 1726882896.45417: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882896.45429: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882896.45444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882896.45460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882896.45506: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882896.45516: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882896.45528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882896.45545: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882896.45557: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882896.45573: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882896.45588: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882896.45599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882896.45612: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882896.45622: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882896.45631: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882896.45641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882896.45722: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882896.45738: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882896.45751: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882896.45882: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882896.47684: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882896.47688: stdout chunk (state=3): >>><<< 33932 1726882896.47695: stderr chunk (state=3): >>><<< 33932 1726882896.47717: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882896.47720: _low_level_execute_command(): starting 33932 1726882896.47725: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882896.2137747-34726-243816969274876/AnsiballZ_ping.py && sleep 0' 33932 1726882896.48391: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882896.48399: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882896.48409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882896.48422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882896.48458: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882896.48480: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882896.48490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882896.48503: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882896.48510: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882896.48516: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882896.48524: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882896.48533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882896.48543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882896.48550: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882896.48557: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882896.48567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882896.48646: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882896.48662: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882896.48676: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882896.48819: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882896.61979: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 33932 1726882896.63031: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 33932 1726882896.63035: stdout chunk (state=3): >>><<< 33932 1726882896.63041: stderr chunk (state=3): >>><<< 33932 1726882896.63060: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 33932 1726882896.63087: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882896.2137747-34726-243816969274876/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 33932 1726882896.63096: _low_level_execute_command(): starting 33932 1726882896.63101: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882896.2137747-34726-243816969274876/ > /dev/null 2>&1 && sleep 0' 33932 1726882896.63727: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882896.63735: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882896.63744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882896.63758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882896.63797: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882896.63803: 
stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882896.63813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882896.63825: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882896.63832: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882896.63838: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882896.63846: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882896.63854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882896.63867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882896.63880: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882896.63886: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882896.63895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882896.63960: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882896.63978: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882896.63984: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882896.64274: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882896.66015: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882896.66087: stderr chunk (state=3): >>><<< 33932 1726882896.66090: stdout chunk (state=3): >>><<< 33932 1726882896.66109: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882896.66112: handler run complete 33932 1726882896.66129: attempt loop complete, returning result 33932 1726882896.66132: _execute() done 33932 1726882896.66134: dumping result to json 33932 1726882896.66136: done dumping result, returning 33932 1726882896.66147: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-615b-5c48-00000000002b] 33932 1726882896.66154: sending task result for task 0e448fcc-3ce9-615b-5c48-00000000002b 33932 1726882896.66249: done sending task result for task 0e448fcc-3ce9-615b-5c48-00000000002b 33932 1726882896.66252: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 33932 1726882896.66321: no more pending results, returning what we have 33932 1726882896.66324: results queue empty 33932 1726882896.66325: checking for any_errors_fatal 33932 1726882896.66332: done checking for any_errors_fatal 33932 1726882896.66333: checking 
for max_fail_percentage 33932 1726882896.66335: done checking for max_fail_percentage 33932 1726882896.66336: checking to see if all hosts have failed and the running result is not ok 33932 1726882896.66337: done checking to see if all hosts have failed 33932 1726882896.66337: getting the remaining hosts for this loop 33932 1726882896.66339: done getting the remaining hosts for this loop 33932 1726882896.66343: getting the next task for host managed_node1 33932 1726882896.66351: done getting next task for host managed_node1 33932 1726882896.66353: ^ task is: TASK: meta (role_complete) 33932 1726882896.66356: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882896.66372: getting variables 33932 1726882896.66374: in VariableManager get_vars() 33932 1726882896.66416: Calling all_inventory to load vars for managed_node1 33932 1726882896.66419: Calling groups_inventory to load vars for managed_node1 33932 1726882896.66421: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882896.66431: Calling all_plugins_play to load vars for managed_node1 33932 1726882896.66434: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882896.66436: Calling groups_plugins_play to load vars for managed_node1 33932 1726882896.68206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882896.70076: done with get_vars() 33932 1726882896.70103: done getting variables 33932 1726882896.70196: done queuing things up, now waiting for results queue to drain 33932 1726882896.70199: results queue empty 33932 1726882896.70199: checking for any_errors_fatal 33932 1726882896.70202: done checking for any_errors_fatal 33932 1726882896.70203: checking for max_fail_percentage 33932 1726882896.70204: done checking for max_fail_percentage 33932 1726882896.70205: checking to see if all hosts have failed and the running result is not ok 33932 1726882896.70206: done checking to see if all hosts have failed 33932 1726882896.70210: getting the remaining hosts for this loop 33932 1726882896.70211: done getting the remaining hosts for this loop 33932 1726882896.70214: getting the next task for host managed_node1 33932 1726882896.70218: done getting next task for host managed_node1 33932 1726882896.70221: ^ task is: TASK: Include the task 'assert_device_present.yml' 33932 1726882896.70222: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 33932 1726882896.70231: getting variables 33932 1726882896.70232: in VariableManager get_vars() 33932 1726882896.70247: Calling all_inventory to load vars for managed_node1 33932 1726882896.70249: Calling groups_inventory to load vars for managed_node1 33932 1726882896.70251: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882896.70256: Calling all_plugins_play to load vars for managed_node1 33932 1726882896.70258: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882896.70261: Calling groups_plugins_play to load vars for managed_node1 33932 1726882896.71660: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882896.73661: done with get_vars() 33932 1726882896.73690: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:46 Friday 20 September 2024 21:41:36 -0400 (0:00:00.586) 0:00:17.205 ****** 33932 1726882896.73776: entering _queue_task() for managed_node1/include_tasks 33932 1726882896.74177: worker is 1 (out of 1 available) 33932 1726882896.74195: exiting _queue_task() for managed_node1/include_tasks 33932 1726882896.74206: done queuing things up, now waiting for results queue to drain 33932 1726882896.74208: waiting for pending results... 
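For context, the include task queued above comes from `tests_vlan_mtu.yml:46`. The actual task body is not shown in the log; the following is a hypothetical sketch reconstructed only from the task name and task path recorded above:

```yaml
# Hypothetical reconstruction of the include at tests_vlan_mtu.yml:46,
# inferred from the task name and path printed in the log above.
- name: Include the task 'assert_device_present.yml'
  include_tasks: tasks/assert_device_present.yml
```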
33932 1726882896.74505: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_present.yml' 33932 1726882896.74613: in run() - task 0e448fcc-3ce9-615b-5c48-00000000005b 33932 1726882896.74639: variable 'ansible_search_path' from source: unknown 33932 1726882896.74690: calling self._execute() 33932 1726882896.74798: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882896.74809: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882896.74821: variable 'omit' from source: magic vars 33932 1726882896.75223: variable 'ansible_distribution_major_version' from source: facts 33932 1726882896.75241: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882896.75253: _execute() done 33932 1726882896.75260: dumping result to json 33932 1726882896.75272: done dumping result, returning 33932 1726882896.75288: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_present.yml' [0e448fcc-3ce9-615b-5c48-00000000005b] 33932 1726882896.75302: sending task result for task 0e448fcc-3ce9-615b-5c48-00000000005b 33932 1726882896.75431: no more pending results, returning what we have 33932 1726882896.75437: in VariableManager get_vars() 33932 1726882896.75488: Calling all_inventory to load vars for managed_node1 33932 1726882896.75491: Calling groups_inventory to load vars for managed_node1 33932 1726882896.75493: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882896.75507: Calling all_plugins_play to load vars for managed_node1 33932 1726882896.75512: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882896.75515: Calling groups_plugins_play to load vars for managed_node1 33932 1726882896.76612: done sending task result for task 0e448fcc-3ce9-615b-5c48-00000000005b 33932 1726882896.76616: WORKER PROCESS EXITING 33932 1726882896.77271: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882896.79345: done with get_vars() 33932 1726882896.79377: variable 'ansible_search_path' from source: unknown 33932 1726882896.79392: we have included files to process 33932 1726882896.79393: generating all_blocks data 33932 1726882896.79395: done generating all_blocks data 33932 1726882896.79400: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 33932 1726882896.79401: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 33932 1726882896.79403: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 33932 1726882896.79526: in VariableManager get_vars() 33932 1726882896.79549: done with get_vars() 33932 1726882896.79677: done processing included file 33932 1726882896.79680: iterating over new_blocks loaded from include file 33932 1726882896.79681: in VariableManager get_vars() 33932 1726882896.79708: done with get_vars() 33932 1726882896.79710: filtering new block on tags 33932 1726882896.79729: done filtering new block on tags 33932 1726882896.79731: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node1 33932 1726882896.79736: extending task lists for all hosts with included blocks 33932 1726882896.82729: done extending task lists 33932 1726882896.82736: done processing included files 33932 1726882896.82738: results queue empty 33932 1726882896.82738: checking for any_errors_fatal 33932 1726882896.82740: done checking for any_errors_fatal 33932 1726882896.82741: checking for max_fail_percentage 33932 1726882896.82742: done 
checking for max_fail_percentage 33932 1726882896.82743: checking to see if all hosts have failed and the running result is not ok 33932 1726882896.82744: done checking to see if all hosts have failed 33932 1726882896.82745: getting the remaining hosts for this loop 33932 1726882896.82746: done getting the remaining hosts for this loop 33932 1726882896.82755: getting the next task for host managed_node1 33932 1726882896.82760: done getting next task for host managed_node1 33932 1726882896.82762: ^ task is: TASK: Include the task 'get_interface_stat.yml' 33932 1726882896.82766: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882896.82771: getting variables 33932 1726882896.82772: in VariableManager get_vars() 33932 1726882896.82786: Calling all_inventory to load vars for managed_node1 33932 1726882896.82788: Calling groups_inventory to load vars for managed_node1 33932 1726882896.82791: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882896.82796: Calling all_plugins_play to load vars for managed_node1 33932 1726882896.82799: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882896.82802: Calling groups_plugins_play to load vars for managed_node1 33932 1726882896.84241: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882896.86089: done with get_vars() 33932 1726882896.86110: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:41:36 -0400 (0:00:00.124) 0:00:17.329 ****** 33932 1726882896.86197: entering _queue_task() for managed_node1/include_tasks 33932 1726882896.86542: worker is 1 (out of 1 available) 33932 1726882896.86555: exiting _queue_task() for managed_node1/include_tasks 33932 1726882896.86571: done queuing things up, now waiting for results queue to drain 33932 1726882896.86574: waiting for pending results... 
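The include chain recorded above (`assert_device_present.yml:3` pulling in `get_interface_stat.yml`, with the conditional `ansible_distribution_major_version != '6'` evaluated to True) suggests a task file along these lines; this is an assumption inferred from the log, not the verbatim file contents:

```yaml
# Hypothetical sketch of assert_device_present.yml, inferred from the
# include chain and the conditional the log shows being evaluated.
- name: Include the task 'get_interface_stat.yml'
  include_tasks: get_interface_stat.yml
  when: ansible_distribution_major_version != '6'
```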
33932 1726882896.86884: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 33932 1726882896.87003: in run() - task 0e448fcc-3ce9-615b-5c48-000000000578 33932 1726882896.87032: variable 'ansible_search_path' from source: unknown 33932 1726882896.87044: variable 'ansible_search_path' from source: unknown 33932 1726882896.87093: calling self._execute() 33932 1726882896.87191: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882896.87200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882896.87211: variable 'omit' from source: magic vars 33932 1726882896.87648: variable 'ansible_distribution_major_version' from source: facts 33932 1726882896.87707: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882896.87719: _execute() done 33932 1726882896.87728: dumping result to json 33932 1726882896.87736: done dumping result, returning 33932 1726882896.87746: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-615b-5c48-000000000578] 33932 1726882896.87756: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000578 33932 1726882896.87923: no more pending results, returning what we have 33932 1726882896.87929: in VariableManager get_vars() 33932 1726882896.87987: Calling all_inventory to load vars for managed_node1 33932 1726882896.87991: Calling groups_inventory to load vars for managed_node1 33932 1726882896.87993: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882896.88008: Calling all_plugins_play to load vars for managed_node1 33932 1726882896.88012: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882896.88016: Calling groups_plugins_play to load vars for managed_node1 33932 1726882896.89132: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000578 33932 1726882896.89135: WORKER PROCESS EXITING 33932 
1726882896.94897: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882896.96849: done with get_vars() 33932 1726882896.96873: variable 'ansible_search_path' from source: unknown 33932 1726882896.96874: variable 'ansible_search_path' from source: unknown 33932 1726882896.96914: we have included files to process 33932 1726882896.96915: generating all_blocks data 33932 1726882896.96916: done generating all_blocks data 33932 1726882896.96917: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 33932 1726882896.96918: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 33932 1726882896.96925: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 33932 1726882896.97191: done processing included file 33932 1726882896.97193: iterating over new_blocks loaded from include file 33932 1726882896.97195: in VariableManager get_vars() 33932 1726882896.97215: done with get_vars() 33932 1726882896.97217: filtering new block on tags 33932 1726882896.97236: done filtering new block on tags 33932 1726882896.97238: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 33932 1726882896.97243: extending task lists for all hosts with included blocks 33932 1726882896.97423: done extending task lists 33932 1726882896.97424: done processing included files 33932 1726882896.97425: results queue empty 33932 1726882896.97426: checking for any_errors_fatal 33932 1726882896.97428: done checking for any_errors_fatal 33932 1726882896.97429: checking for max_fail_percentage 33932 1726882896.97430: done checking for 
max_fail_percentage 33932 1726882896.97431: checking to see if all hosts have failed and the running result is not ok 33932 1726882896.97432: done checking to see if all hosts have failed 33932 1726882896.97432: getting the remaining hosts for this loop 33932 1726882896.97434: done getting the remaining hosts for this loop 33932 1726882896.97436: getting the next task for host managed_node1 33932 1726882896.97467: done getting next task for host managed_node1 33932 1726882896.97474: ^ task is: TASK: Get stat for interface {{ interface }} 33932 1726882896.97478: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882896.97480: getting variables 33932 1726882896.97481: in VariableManager get_vars() 33932 1726882896.97494: Calling all_inventory to load vars for managed_node1 33932 1726882896.97496: Calling groups_inventory to load vars for managed_node1 33932 1726882896.97498: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882896.97503: Calling all_plugins_play to load vars for managed_node1 33932 1726882896.97506: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882896.97509: Calling groups_plugins_play to load vars for managed_node1 33932 1726882896.98827: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882897.00166: done with get_vars() 33932 1726882897.00181: done getting variables 33932 1726882897.00283: variable 'interface' from source: include params 33932 1726882897.00285: variable 'vlan_interface' from source: play vars 33932 1726882897.00331: variable 'vlan_interface' from source: play vars TASK [Get stat for interface lsr101.90] **************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:41:37 -0400 (0:00:00.141) 0:00:17.471 ****** 33932 1726882897.00351: entering _queue_task() for managed_node1/stat 33932 1726882897.00593: worker is 1 (out of 1 available) 33932 1726882897.00606: exiting _queue_task() for managed_node1/stat 33932 1726882897.00618: done queuing things up, now waiting for results queue to drain 33932 1726882897.00620: waiting for pending results... 
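The task name `Get stat for interface {{ interface }}` renders as `Get stat for interface lsr101.90` because, as the variable-resolution records above show, `interface` arrives via include params from the `vlan_interface` play var. A minimal sketch of what `get_interface_stat.yml:3` might contain, assuming the conventional `/sys/class/net` path (the log confirms only that the `stat` action module runs):

```yaml
# Hypothetical sketch of get_interface_stat.yml, inferred from the
# templated task name and the stat action dispatched in the log.
# The /sys/class/net path is an assumption, not shown in the log.
- name: Get stat for interface {{ interface }}
  stat:
    path: /sys/class/net/{{ interface }}
  register: interface_stat
```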
33932 1726882897.00810: running TaskExecutor() for managed_node1/TASK: Get stat for interface lsr101.90 33932 1726882897.00885: in run() - task 0e448fcc-3ce9-615b-5c48-00000000069c 33932 1726882897.00895: variable 'ansible_search_path' from source: unknown 33932 1726882897.00899: variable 'ansible_search_path' from source: unknown 33932 1726882897.00928: calling self._execute() 33932 1726882897.01004: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882897.01007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882897.01015: variable 'omit' from source: magic vars 33932 1726882897.01341: variable 'ansible_distribution_major_version' from source: facts 33932 1726882897.01362: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882897.01380: variable 'omit' from source: magic vars 33932 1726882897.01431: variable 'omit' from source: magic vars 33932 1726882897.01556: variable 'interface' from source: include params 33932 1726882897.01561: variable 'vlan_interface' from source: play vars 33932 1726882897.01628: variable 'vlan_interface' from source: play vars 33932 1726882897.01646: variable 'omit' from source: magic vars 33932 1726882897.01688: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882897.01729: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882897.01750: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882897.01767: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882897.01781: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882897.01814: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 33932 1726882897.01819: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882897.01821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882897.01917: Set connection var ansible_shell_executable to /bin/sh 33932 1726882897.01929: Set connection var ansible_timeout to 10 33932 1726882897.01934: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882897.01941: Set connection var ansible_pipelining to False 33932 1726882897.01944: Set connection var ansible_connection to ssh 33932 1726882897.01947: Set connection var ansible_shell_type to sh 33932 1726882897.01972: variable 'ansible_shell_executable' from source: unknown 33932 1726882897.01976: variable 'ansible_connection' from source: unknown 33932 1726882897.01979: variable 'ansible_module_compression' from source: unknown 33932 1726882897.01981: variable 'ansible_shell_type' from source: unknown 33932 1726882897.01983: variable 'ansible_shell_executable' from source: unknown 33932 1726882897.01986: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882897.01988: variable 'ansible_pipelining' from source: unknown 33932 1726882897.01990: variable 'ansible_timeout' from source: unknown 33932 1726882897.01992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882897.02184: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 33932 1726882897.02193: variable 'omit' from source: magic vars 33932 1726882897.02200: starting attempt loop 33932 1726882897.02203: running the handler 33932 1726882897.02217: _low_level_execute_command(): starting 33932 1726882897.02224: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 33932 
1726882897.02879: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882897.02889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882897.02919: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882897.02944: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882897.02947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882897.02949: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882897.03008: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882897.03013: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882897.03015: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882897.03116: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882897.04793: stdout chunk (state=3): >>>/root <<< 33932 1726882897.04940: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882897.04958: stderr chunk (state=3): >>><<< 33932 1726882897.04967: stdout chunk (state=3): >>><<< 33932 1726882897.04995: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882897.05004: _low_level_execute_command(): starting 33932 1726882897.05009: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882897.049906-34750-11524269830997 `" && echo ansible-tmp-1726882897.049906-34750-11524269830997="` echo /root/.ansible/tmp/ansible-tmp-1726882897.049906-34750-11524269830997 `" ) && sleep 0' 33932 1726882897.05583: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882897.05590: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882897.05601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882897.05614: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882897.05652: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882897.05661: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882897.05671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882897.05682: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882897.05691: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882897.05697: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882897.05706: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882897.05714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882897.05726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882897.05734: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882897.05740: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882897.05749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882897.05820: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882897.05835: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882897.05846: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882897.05977: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882897.07843: stdout chunk (state=3): >>>ansible-tmp-1726882897.049906-34750-11524269830997=/root/.ansible/tmp/ansible-tmp-1726882897.049906-34750-11524269830997 <<< 33932 1726882897.07954: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882897.07999: stderr chunk (state=3): >>><<< 33932 1726882897.08005: stdout chunk (state=3): >>><<< 33932 1726882897.08017: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882897.049906-34750-11524269830997=/root/.ansible/tmp/ansible-tmp-1726882897.049906-34750-11524269830997 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882897.08053: variable 'ansible_module_compression' from source: unknown 33932 1726882897.08102: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-33932njcnmxb6/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 33932 1726882897.08131: variable 'ansible_facts' from source: unknown 33932 1726882897.08207: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882897.049906-34750-11524269830997/AnsiballZ_stat.py 33932 1726882897.08325: Sending initial 
data 33932 1726882897.08328: Sent initial data (151 bytes) 33932 1726882897.09456: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882897.09460: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882897.09462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882897.09467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882897.09470: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882897.09474: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882897.09477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882897.09479: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882897.09481: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882897.09483: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882897.09485: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882897.09488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882897.09490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882897.09492: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882897.09494: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882897.09495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882897.09497: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882897.09499: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882897.09501: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882897.09551: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882897.11316: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 33932 1726882897.11408: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 33932 1726882897.11518: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-33932njcnmxb6/tmpju1hc0hw /root/.ansible/tmp/ansible-tmp-1726882897.049906-34750-11524269830997/AnsiballZ_stat.py <<< 33932 1726882897.11607: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 33932 1726882897.12613: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882897.12698: stderr chunk (state=3): >>><<< 33932 1726882897.12701: stdout chunk (state=3): >>><<< 33932 1726882897.12715: done transferring module to remote 33932 1726882897.12724: _low_level_execute_command(): starting 33932 1726882897.12728: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882897.049906-34750-11524269830997/ /root/.ansible/tmp/ansible-tmp-1726882897.049906-34750-11524269830997/AnsiballZ_stat.py && sleep 0' 33932 
1726882897.13120: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882897.13127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882897.13159: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882897.13172: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882897.13178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882897.13195: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 33932 1726882897.13197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882897.13247: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882897.13251: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882897.13350: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882897.15115: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882897.15147: stderr chunk (state=3): >>><<< 33932 1726882897.15150: stdout chunk (state=3): >>><<< 33932 1726882897.15165: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882897.15169: _low_level_execute_command(): starting 33932 1726882897.15175: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882897.049906-34750-11524269830997/AnsiballZ_stat.py && sleep 0' 33932 1726882897.15571: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882897.15575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882897.15608: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882897.15611: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882897.15613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882897.15661: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882897.15667: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882897.15783: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882897.29126: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr101.90", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 31016, "dev": 21, "nlink": 1, "atime": 1726882895.6801782, "mtime": 1726882895.6801782, "ctime": 1726882895.6801782, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr101.90", "lnk_target": "../../devices/virtual/net/lsr101.90", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr101.90", "follow": false, "checksum_algorithm": "sha1"}}} <<< 33932 1726882897.30155: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 
closed. <<< 33932 1726882897.30204: stderr chunk (state=3): >>><<< 33932 1726882897.30207: stdout chunk (state=3): >>><<< 33932 1726882897.30223: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr101.90", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 31016, "dev": 21, "nlink": 1, "atime": 1726882895.6801782, "mtime": 1726882895.6801782, "ctime": 1726882895.6801782, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr101.90", "lnk_target": "../../devices/virtual/net/lsr101.90", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr101.90", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 33932 1726882897.30262: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/lsr101.90', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882897.049906-34750-11524269830997/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 33932 1726882897.30273: _low_level_execute_command(): starting 33932 1726882897.30281: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882897.049906-34750-11524269830997/ > /dev/null 2>&1 && sleep 0' 33932 1726882897.30713: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882897.30720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882897.30753: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 
1726882897.30758: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration <<< 33932 1726882897.30767: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882897.30773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882897.30792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 33932 1726882897.30795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882897.30837: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882897.30855: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882897.30961: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882897.32794: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882897.32834: stderr chunk (state=3): >>><<< 33932 1726882897.32837: stdout chunk (state=3): >>><<< 33932 1726882897.32849: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882897.32854: handler run complete 33932 1726882897.32894: attempt loop complete, returning result 33932 1726882897.32897: _execute() done 33932 1726882897.32899: dumping result to json 33932 1726882897.32903: done dumping result, returning 33932 1726882897.32910: done running TaskExecutor() for managed_node1/TASK: Get stat for interface lsr101.90 [0e448fcc-3ce9-615b-5c48-00000000069c] 33932 1726882897.32914: sending task result for task 0e448fcc-3ce9-615b-5c48-00000000069c 33932 1726882897.33015: done sending task result for task 0e448fcc-3ce9-615b-5c48-00000000069c 33932 1726882897.33018: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "atime": 1726882895.6801782, "block_size": 4096, "blocks": 0, "ctime": 1726882895.6801782, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 31016, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/lsr101.90", "lnk_target": "../../devices/virtual/net/lsr101.90", "mode": "0777", "mtime": 1726882895.6801782, "nlink": 1, "path": "/sys/class/net/lsr101.90", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 33932 
1726882897.33105: no more pending results, returning what we have 33932 1726882897.33109: results queue empty 33932 1726882897.33111: checking for any_errors_fatal 33932 1726882897.33112: done checking for any_errors_fatal 33932 1726882897.33113: checking for max_fail_percentage 33932 1726882897.33115: done checking for max_fail_percentage 33932 1726882897.33116: checking to see if all hosts have failed and the running result is not ok 33932 1726882897.33116: done checking to see if all hosts have failed 33932 1726882897.33117: getting the remaining hosts for this loop 33932 1726882897.33119: done getting the remaining hosts for this loop 33932 1726882897.33122: getting the next task for host managed_node1 33932 1726882897.33129: done getting next task for host managed_node1 33932 1726882897.33132: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 33932 1726882897.33135: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882897.33140: getting variables 33932 1726882897.33141: in VariableManager get_vars() 33932 1726882897.33185: Calling all_inventory to load vars for managed_node1 33932 1726882897.33188: Calling groups_inventory to load vars for managed_node1 33932 1726882897.33190: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882897.33200: Calling all_plugins_play to load vars for managed_node1 33932 1726882897.33203: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882897.33205: Calling groups_plugins_play to load vars for managed_node1 33932 1726882897.34037: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882897.35002: done with get_vars() 33932 1726882897.35017: done getting variables 33932 1726882897.35059: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 33932 1726882897.35153: variable 'interface' from source: include params 33932 1726882897.35156: variable 'vlan_interface' from source: play vars 33932 1726882897.35203: variable 'vlan_interface' from source: play vars TASK [Assert that the interface is present - 'lsr101.90'] ********************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:41:37 -0400 (0:00:00.348) 0:00:17.819 ****** 33932 1726882897.35226: entering _queue_task() for managed_node1/assert 33932 1726882897.35445: worker is 1 (out of 1 available) 33932 1726882897.35458: exiting _queue_task() for managed_node1/assert 33932 1726882897.35474: done queuing things up, now waiting for results queue to drain 33932 1726882897.35477: waiting for pending results... 
33932 1726882897.35654: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'lsr101.90' 33932 1726882897.35729: in run() - task 0e448fcc-3ce9-615b-5c48-000000000579 33932 1726882897.35744: variable 'ansible_search_path' from source: unknown 33932 1726882897.35747: variable 'ansible_search_path' from source: unknown 33932 1726882897.35777: calling self._execute() 33932 1726882897.35851: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882897.35856: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882897.35865: variable 'omit' from source: magic vars 33932 1726882897.36133: variable 'ansible_distribution_major_version' from source: facts 33932 1726882897.36143: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882897.36149: variable 'omit' from source: magic vars 33932 1726882897.36179: variable 'omit' from source: magic vars 33932 1726882897.36245: variable 'interface' from source: include params 33932 1726882897.36249: variable 'vlan_interface' from source: play vars 33932 1726882897.36298: variable 'vlan_interface' from source: play vars 33932 1726882897.36313: variable 'omit' from source: magic vars 33932 1726882897.36348: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882897.36378: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882897.36395: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882897.36411: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882897.36421: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882897.36443: variable 'inventory_hostname' from 
source: host vars for 'managed_node1' 33932 1726882897.36446: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882897.36449: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882897.36524: Set connection var ansible_shell_executable to /bin/sh 33932 1726882897.36530: Set connection var ansible_timeout to 10 33932 1726882897.36533: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882897.36539: Set connection var ansible_pipelining to False 33932 1726882897.36542: Set connection var ansible_connection to ssh 33932 1726882897.36545: Set connection var ansible_shell_type to sh 33932 1726882897.36562: variable 'ansible_shell_executable' from source: unknown 33932 1726882897.36566: variable 'ansible_connection' from source: unknown 33932 1726882897.36572: variable 'ansible_module_compression' from source: unknown 33932 1726882897.36575: variable 'ansible_shell_type' from source: unknown 33932 1726882897.36578: variable 'ansible_shell_executable' from source: unknown 33932 1726882897.36581: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882897.36583: variable 'ansible_pipelining' from source: unknown 33932 1726882897.36585: variable 'ansible_timeout' from source: unknown 33932 1726882897.36587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882897.36689: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882897.36699: variable 'omit' from source: magic vars 33932 1726882897.36704: starting attempt loop 33932 1726882897.36707: running the handler 33932 1726882897.36799: variable 'interface_stat' from source: set_fact 33932 1726882897.36814: Evaluated 
conditional (interface_stat.stat.exists): True 33932 1726882897.36824: handler run complete 33932 1726882897.36836: attempt loop complete, returning result 33932 1726882897.36839: _execute() done 33932 1726882897.36841: dumping result to json 33932 1726882897.36844: done dumping result, returning 33932 1726882897.36846: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'lsr101.90' [0e448fcc-3ce9-615b-5c48-000000000579] 33932 1726882897.36855: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000579 33932 1726882897.36944: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000579 33932 1726882897.36947: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 33932 1726882897.36999: no more pending results, returning what we have 33932 1726882897.37004: results queue empty 33932 1726882897.37005: checking for any_errors_fatal 33932 1726882897.37014: done checking for any_errors_fatal 33932 1726882897.37014: checking for max_fail_percentage 33932 1726882897.37016: done checking for max_fail_percentage 33932 1726882897.37017: checking to see if all hosts have failed and the running result is not ok 33932 1726882897.37017: done checking to see if all hosts have failed 33932 1726882897.37018: getting the remaining hosts for this loop 33932 1726882897.37020: done getting the remaining hosts for this loop 33932 1726882897.37023: getting the next task for host managed_node1 33932 1726882897.37030: done getting next task for host managed_node1 33932 1726882897.37033: ^ task is: TASK: Include the task 'assert_profile_present.yml' 33932 1726882897.37039: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
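The assertion evaluated here (`Evaluated conditional (interface_stat.stat.exists): True`) corresponds to a task along these lines; the conditional expression is taken verbatim from the log, the surrounding task shape is an assumption:

```yaml
# Hypothetical sketch of the assert task from assert_device_present.yml
# referenced in the log; the conditional matches the one evaluated above.
- name: Assert that the interface is present - '{{ interface }}'
  assert:
    that:
      - interface_stat.stat.exists   # true when /sys/class/net/<interface> was found
```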
False 33932 1726882897.37044: getting variables 33932 1726882897.37045: in VariableManager get_vars() 33932 1726882897.37092: Calling all_inventory to load vars for managed_node1 33932 1726882897.37095: Calling groups_inventory to load vars for managed_node1 33932 1726882897.37098: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882897.37107: Calling all_plugins_play to load vars for managed_node1 33932 1726882897.37109: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882897.37112: Calling groups_plugins_play to load vars for managed_node1 33932 1726882897.38033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882897.38959: done with get_vars() 33932 1726882897.38976: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:50 Friday 20 September 2024 21:41:37 -0400 (0:00:00.038) 0:00:17.858 ****** 33932 1726882897.39041: entering _queue_task() for managed_node1/include_tasks 33932 1726882897.39254: worker is 1 (out of 1 available) 33932 1726882897.39267: exiting _queue_task() for managed_node1/include_tasks 33932 1726882897.39277: done queuing things up, now waiting for results queue to drain 33932 1726882897.39279: waiting for pending results... 
33932 1726882897.39503: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_profile_present.yml' 33932 1726882897.39580: in run() - task 0e448fcc-3ce9-615b-5c48-00000000005c 33932 1726882897.39591: variable 'ansible_search_path' from source: unknown 33932 1726882897.39631: variable 'interface' from source: play vars 33932 1726882897.39778: variable 'interface' from source: play vars 33932 1726882897.39789: variable 'vlan_interface' from source: play vars 33932 1726882897.39837: variable 'vlan_interface' from source: play vars 33932 1726882897.39849: variable 'omit' from source: magic vars 33932 1726882897.39951: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882897.39957: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882897.39973: variable 'omit' from source: magic vars 33932 1726882897.40137: variable 'ansible_distribution_major_version' from source: facts 33932 1726882897.40145: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882897.40172: variable 'item' from source: unknown 33932 1726882897.40217: variable 'item' from source: unknown 33932 1726882897.40352: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882897.40355: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882897.40358: variable 'omit' from source: magic vars 33932 1726882897.40430: variable 'ansible_distribution_major_version' from source: facts 33932 1726882897.40434: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882897.40452: variable 'item' from source: unknown 33932 1726882897.40500: variable 'item' from source: unknown 33932 1726882897.40560: dumping result to json 33932 1726882897.40565: done dumping result, returning 33932 1726882897.40570: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_profile_present.yml' 
[0e448fcc-3ce9-615b-5c48-00000000005c]
33932 1726882897.40573: sending task result for task 0e448fcc-3ce9-615b-5c48-00000000005c
33932 1726882897.40618: done sending task result for task 0e448fcc-3ce9-615b-5c48-00000000005c
33932 1726882897.40620: WORKER PROCESS EXITING
33932 1726882897.40642: no more pending results, returning what we have
33932 1726882897.40647: in VariableManager get_vars()
33932 1726882897.40702: Calling all_inventory to load vars for managed_node1
33932 1726882897.40705: Calling groups_inventory to load vars for managed_node1
33932 1726882897.40707: Calling all_plugins_inventory to load vars for managed_node1
33932 1726882897.40717: Calling all_plugins_play to load vars for managed_node1
33932 1726882897.40719: Calling groups_plugins_inventory to load vars for managed_node1
33932 1726882897.40722: Calling groups_plugins_play to load vars for managed_node1
33932 1726882897.41993: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33932 1726882897.43192: done with get_vars()
33932 1726882897.43204: variable 'ansible_search_path' from source: unknown
33932 1726882897.43215: variable 'ansible_search_path' from source: unknown
33932 1726882897.43222: we have included files to process
33932 1726882897.43223: generating all_blocks data
33932 1726882897.43224: done generating all_blocks data
33932 1726882897.43228: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml
33932 1726882897.43229: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml
33932 1726882897.43230: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml
33932 1726882897.43359: in VariableManager get_vars()
33932 1726882897.43377: done with get_vars()
33932 1726882897.43543: done processing included file
33932 1726882897.43545: iterating over new_blocks loaded from include file
33932 1726882897.43546: in VariableManager get_vars()
33932 1726882897.43559: done with get_vars()
33932 1726882897.43561: filtering new block on tags
33932 1726882897.43577: done filtering new block on tags
33932 1726882897.43578: done iterating over new_blocks loaded from include file
included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node1 => (item=lsr101)
33932 1726882897.43581: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml
33932 1726882897.43582: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml
33932 1726882897.43584: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml
33932 1726882897.43641: in VariableManager get_vars()
33932 1726882897.43655: done with get_vars()
33932 1726882897.43806: done processing included file
33932 1726882897.43808: iterating over new_blocks loaded from include file
33932 1726882897.43809: in VariableManager get_vars()
33932 1726882897.43819: done with get_vars()
33932 1726882897.43820: filtering new block on tags
33932 1726882897.43831: done filtering new block on tags
33932 1726882897.43832: done iterating over new_blocks loaded from include file
included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node1 => (item=lsr101.90)
33932 1726882897.43834: extending task lists for all hosts with included blocks
33932 1726882897.46427: done extending task lists
33932 1726882897.46429: done processing included files
33932 1726882897.46429: results queue empty
33932 1726882897.46430: checking for any_errors_fatal
33932 1726882897.46433: done checking for any_errors_fatal
33932 1726882897.46434: checking for max_fail_percentage
33932 1726882897.46435: done checking for max_fail_percentage
33932 1726882897.46436: checking to see if all hosts have failed and the running result is not ok
33932 1726882897.46437: done checking to see if all hosts have failed
33932 1726882897.46437: getting the remaining hosts for this loop
33932 1726882897.46439: done getting the remaining hosts for this loop
33932 1726882897.46441: getting the next task for host managed_node1
33932 1726882897.46445: done getting next task for host managed_node1
33932 1726882897.46447: ^ task is: TASK: Include the task 'get_profile_stat.yml'
33932 1726882897.46449: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33932 1726882897.46451: getting variables
33932 1726882897.46452: in VariableManager get_vars()
33932 1726882897.46466: Calling all_inventory to load vars for managed_node1
33932 1726882897.46470: Calling groups_inventory to load vars for managed_node1
33932 1726882897.46472: Calling all_plugins_inventory to load vars for managed_node1
33932 1726882897.46477: Calling all_plugins_play to load vars for managed_node1
33932 1726882897.46480: Calling groups_plugins_inventory to load vars for managed_node1
33932 1726882897.46487: Calling groups_plugins_play to load vars for managed_node1
33932 1726882897.47701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33932 1726882897.49384: done with get_vars()
33932 1726882897.49429: done getting variables

TASK [Include the task 'get_profile_stat.yml'] *********************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3
Friday 20 September 2024 21:41:37 -0400 (0:00:00.104) 0:00:17.962 ******
33932 1726882897.49505: entering _queue_task() for managed_node1/include_tasks
33932 1726882897.49857: worker is 1 (out of 1 available)
33932 1726882897.49877: exiting _queue_task() for managed_node1/include_tasks
33932 1726882897.49891: done queuing things up, now waiting for results queue to drain
33932 1726882897.49893: waiting for pending results...
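Every entry in the trace above is prefixed with the controller PID (33932) and a Unix epoch timestamp, and each task banner reports the previous task's duration and the cumulative playbook time. As an illustration only (this parser is not part of Ansible; the `PID EPOCH: message` entry shape is an assumption read off this trace), per-entry latencies can be recovered like so:

```python
import re

# Matches debug entries of the form "33932 1726882897.49505: message".
ENTRY_RE = re.compile(r"^(?P<pid>\d+) (?P<ts>\d+\.\d+): (?P<msg>.*)$")

def entry_deltas(lines):
    """Return (seconds_since_previous_entry, message) pairs for matching lines."""
    out, prev = [], None
    for line in lines:
        m = ENTRY_RE.match(line)
        if not m:
            continue  # callback banners ("TASK [...]") carry no timestamp prefix
        ts = float(m.group("ts"))
        out.append((0.0 if prev is None else ts - prev, m.group("msg")))
        prev = ts
    return out

log = [
    "33932 1726882897.49505: entering _queue_task() for managed_node1/include_tasks",
    "33932 1726882897.49857: worker is 1 (out of 1 available)",
]
for delta, msg in entry_deltas(log):
    print(f"{delta:.5f}  {msg}")
```

This kind of delta view makes it easy to spot where a verbose run actually spends its time (here, about 3.5 ms between the two entries).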
33932 1726882897.50199: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml'
33932 1726882897.50314: in run() - task 0e448fcc-3ce9-615b-5c48-0000000006b8
33932 1726882897.50348: variable 'ansible_search_path' from source: unknown
33932 1726882897.50356: variable 'ansible_search_path' from source: unknown
33932 1726882897.50401: calling self._execute()
33932 1726882897.50512: variable 'ansible_host' from source: host vars for 'managed_node1'
33932 1726882897.50525: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33932 1726882897.50545: variable 'omit' from source: magic vars
33932 1726882897.50941: variable 'ansible_distribution_major_version' from source: facts
33932 1726882897.50960: Evaluated conditional (ansible_distribution_major_version != '6'): True
33932 1726882897.50978: _execute() done
33932 1726882897.50992: dumping result to json
33932 1726882897.51000: done dumping result, returning
33932 1726882897.51009: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [0e448fcc-3ce9-615b-5c48-0000000006b8]
33932 1726882897.51020: sending task result for task 0e448fcc-3ce9-615b-5c48-0000000006b8
33932 1726882897.51143: no more pending results, returning what we have
33932 1726882897.51149: in VariableManager get_vars()
33932 1726882897.51202: Calling all_inventory to load vars for managed_node1
33932 1726882897.51205: Calling groups_inventory to load vars for managed_node1
33932 1726882897.51208: Calling all_plugins_inventory to load vars for managed_node1
33932 1726882897.51222: Calling all_plugins_play to load vars for managed_node1
33932 1726882897.51226: Calling groups_plugins_inventory to load vars for managed_node1
33932 1726882897.51229: Calling groups_plugins_play to load vars for managed_node1
33932 1726882897.52285: done sending task result for task 0e448fcc-3ce9-615b-5c48-0000000006b8
33932 1726882897.52288: WORKER PROCESS EXITING
33932 1726882897.53007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33932 1726882897.54907: done with get_vars()
33932 1726882897.54926: variable 'ansible_search_path' from source: unknown
33932 1726882897.54937: variable 'ansible_search_path' from source: unknown
33932 1726882897.54976: we have included files to process
33932 1726882897.54977: generating all_blocks data
33932 1726882897.54979: done generating all_blocks data
33932 1726882897.54980: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
33932 1726882897.54981: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
33932 1726882897.54983: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
33932 1726882897.56124: done processing included file
33932 1726882897.56126: iterating over new_blocks loaded from include file
33932 1726882897.56127: in VariableManager get_vars()
33932 1726882897.56144: done with get_vars()
33932 1726882897.56146: filtering new block on tags
33932 1726882897.56171: done filtering new block on tags
33932 1726882897.56173: in VariableManager get_vars()
33932 1726882897.56189: done with get_vars()
33932 1726882897.56191: filtering new block on tags
33932 1726882897.56209: done filtering new block on tags
33932 1726882897.56211: done iterating over new_blocks loaded from include file
included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1
33932 1726882897.56216: extending task lists for all hosts with included blocks
33932 1726882897.56376: done extending task lists
33932 1726882897.56378: done processing included files
33932 1726882897.56379: results queue empty
33932 1726882897.56380: checking for any_errors_fatal
33932 1726882897.56383: done checking for any_errors_fatal
33932 1726882897.56384: checking for max_fail_percentage
33932 1726882897.56385: done checking for max_fail_percentage
33932 1726882897.56385: checking to see if all hosts have failed and the running result is not ok
33932 1726882897.56386: done checking to see if all hosts have failed
33932 1726882897.56387: getting the remaining hosts for this loop
33932 1726882897.56388: done getting the remaining hosts for this loop
33932 1726882897.56391: getting the next task for host managed_node1
33932 1726882897.56394: done getting next task for host managed_node1
33932 1726882897.56396: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag
33932 1726882897.56399: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33932 1726882897.56401: getting variables
33932 1726882897.56402: in VariableManager get_vars()
33932 1726882897.56454: Calling all_inventory to load vars for managed_node1
33932 1726882897.56456: Calling groups_inventory to load vars for managed_node1
33932 1726882897.56458: Calling all_plugins_inventory to load vars for managed_node1
33932 1726882897.56465: Calling all_plugins_play to load vars for managed_node1
33932 1726882897.56470: Calling groups_plugins_inventory to load vars for managed_node1
33932 1726882897.56473: Calling groups_plugins_play to load vars for managed_node1
33932 1726882897.57671: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33932 1726882897.59408: done with get_vars()
33932 1726882897.59429: done getting variables
33932 1726882897.59476: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Initialize NM profile exist and ansible_managed comment flag] ************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3
Friday 20 September 2024 21:41:37 -0400 (0:00:00.099) 0:00:18.062 ******
33932 1726882897.59506: entering _queue_task() for managed_node1/set_fact
33932 1726882897.59854: worker is 1 (out of 1 available)
33932 1726882897.59870: exiting _queue_task() for managed_node1/set_fact
33932 1726882897.59883: done queuing things up, now waiting for results queue to drain
33932 1726882897.59885: waiting for pending results...
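The repeated "Loading ActionModule 'set_fact' ... (found_in_cache=True)" entries show the plugin loader serving a previously loaded plugin from its cache rather than re-importing it. A toy sketch of that load-once pattern (hypothetical names; this is not Ansible's actual `PluginLoader` code) looks like:

```python
class PluginCache:
    """Toy load-once plugin cache, mimicking the log's found_in_cache=True notes."""

    def __init__(self):
        self._cache = {}
        self.loads = 0  # counts actual factory invocations, not lookups

    def get(self, name, factory):
        """Return (plugin, found_in_cache); only call factory on a cache miss."""
        if name in self._cache:
            return self._cache[name], True
        self.loads += 1
        plugin = factory()
        self._cache[name] = plugin
        return plugin, False

cache = PluginCache()
_, hit_first = cache.get("set_fact", dict)   # first lookup loads the plugin
_, hit_second = cache.get("set_fact", dict)  # second lookup is a cache hit
print(hit_first, hit_second)
```

The design choice mirrored here is that plugin loading is expensive (file search plus import), so subsequent lookups within one run should never repeat it.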
33932 1726882897.60241: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag
33932 1726882897.60379: in run() - task 0e448fcc-3ce9-615b-5c48-0000000007f0
33932 1726882897.60403: variable 'ansible_search_path' from source: unknown
33932 1726882897.60411: variable 'ansible_search_path' from source: unknown
33932 1726882897.60460: calling self._execute()
33932 1726882897.60582: variable 'ansible_host' from source: host vars for 'managed_node1'
33932 1726882897.60595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33932 1726882897.60614: variable 'omit' from source: magic vars
33932 1726882897.61151: variable 'ansible_distribution_major_version' from source: facts
33932 1726882897.61175: Evaluated conditional (ansible_distribution_major_version != '6'): True
33932 1726882897.61186: variable 'omit' from source: magic vars
33932 1726882897.61240: variable 'omit' from source: magic vars
33932 1726882897.61287: variable 'omit' from source: magic vars
33932 1726882897.61337: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
33932 1726882897.61382: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
33932 1726882897.61406: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
33932 1726882897.61434: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
33932 1726882897.61449: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
33932 1726882897.61486: variable 'inventory_hostname' from source: host vars for 'managed_node1'
33932 1726882897.61495: variable 'ansible_host' from source: host vars for 'managed_node1'
33932 1726882897.61502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33932 1726882897.61607: Set connection var ansible_shell_executable to /bin/sh
33932 1726882897.61617: Set connection var ansible_timeout to 10
33932 1726882897.61625: Set connection var ansible_module_compression to ZIP_DEFLATED
33932 1726882897.61632: Set connection var ansible_pipelining to False
33932 1726882897.61642: Set connection var ansible_connection to ssh
33932 1726882897.61647: Set connection var ansible_shell_type to sh
33932 1726882897.61680: variable 'ansible_shell_executable' from source: unknown
33932 1726882897.61688: variable 'ansible_connection' from source: unknown
33932 1726882897.61696: variable 'ansible_module_compression' from source: unknown
33932 1726882897.61703: variable 'ansible_shell_type' from source: unknown
33932 1726882897.61711: variable 'ansible_shell_executable' from source: unknown
33932 1726882897.61718: variable 'ansible_host' from source: host vars for 'managed_node1'
33932 1726882897.61725: variable 'ansible_pipelining' from source: unknown
33932 1726882897.61732: variable 'ansible_timeout' from source: unknown
33932 1726882897.61741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33932 1726882897.61881: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
33932 1726882897.61900: variable 'omit' from source: magic vars
33932 1726882897.61912: starting attempt loop
33932 1726882897.61919: running the handler
33932 1726882897.61937: handler run complete
33932 1726882897.61951: attempt loop complete, returning result
33932 1726882897.61958: _execute() done
33932 1726882897.61974: dumping result to json
33932 1726882897.61982: done dumping result, returning
33932 1726882897.61992: done running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [0e448fcc-3ce9-615b-5c48-0000000007f0]
33932 1726882897.62001: sending task result for task 0e448fcc-3ce9-615b-5c48-0000000007f0
ok: [managed_node1] => {
    "ansible_facts": {
        "lsr_net_profile_ansible_managed": false,
        "lsr_net_profile_exists": false,
        "lsr_net_profile_fingerprint": false
    },
    "changed": false
}
33932 1726882897.62145: no more pending results, returning what we have
33932 1726882897.62149: results queue empty
33932 1726882897.62150: checking for any_errors_fatal
33932 1726882897.62152: done checking for any_errors_fatal
33932 1726882897.62153: checking for max_fail_percentage
33932 1726882897.62154: done checking for max_fail_percentage
33932 1726882897.62155: checking to see if all hosts have failed and the running result is not ok
33932 1726882897.62156: done checking to see if all hosts have failed
33932 1726882897.62157: getting the remaining hosts for this loop
33932 1726882897.62159: done getting the remaining hosts for this loop
33932 1726882897.62163: getting the next task for host managed_node1
33932 1726882897.62174: done getting next task for host managed_node1
33932 1726882897.62177: ^ task is: TASK: Stat profile file
33932 1726882897.62181: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33932 1726882897.62186: getting variables
33932 1726882897.62188: in VariableManager get_vars()
33932 1726882897.62232: Calling all_inventory to load vars for managed_node1
33932 1726882897.62235: Calling groups_inventory to load vars for managed_node1
33932 1726882897.62238: Calling all_plugins_inventory to load vars for managed_node1
33932 1726882897.62250: Calling all_plugins_play to load vars for managed_node1
33932 1726882897.62253: Calling groups_plugins_inventory to load vars for managed_node1
33932 1726882897.62256: Calling groups_plugins_play to load vars for managed_node1
33932 1726882897.63886: done sending task result for task 0e448fcc-3ce9-615b-5c48-0000000007f0
33932 1726882897.63890: WORKER PROCESS EXITING
33932 1726882897.64056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33932 1726882897.65774: done with get_vars()
33932 1726882897.65800: done getting variables

TASK [Stat profile file] *******************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9
Friday 20 September 2024 21:41:37 -0400 (0:00:00.063) 0:00:18.126 ******
33932 1726882897.65918: entering _queue_task() for managed_node1/stat
33932 1726882897.66260: worker is 1 (out of 1 available)
33932 1726882897.66280: exiting _queue_task() for managed_node1/stat
33932 1726882897.66292: done queuing things up, now waiting for results queue to drain
33932 1726882897.66295: waiting for pending results...
33932 1726882897.66604: running TaskExecutor() for managed_node1/TASK: Stat profile file
33932 1726882897.66849: in run() - task 0e448fcc-3ce9-615b-5c48-0000000007f1
33932 1726882897.66875: variable 'ansible_search_path' from source: unknown
33932 1726882897.66883: variable 'ansible_search_path' from source: unknown
33932 1726882897.66931: calling self._execute()
33932 1726882897.67045: variable 'ansible_host' from source: host vars for 'managed_node1'
33932 1726882897.67056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33932 1726882897.67074: variable 'omit' from source: magic vars
33932 1726882897.67466: variable 'ansible_distribution_major_version' from source: facts
33932 1726882897.67490: Evaluated conditional (ansible_distribution_major_version != '6'): True
33932 1726882897.67501: variable 'omit' from source: magic vars
33932 1726882897.67543: variable 'omit' from source: magic vars
33932 1726882897.67643: variable 'profile' from source: include params
33932 1726882897.67653: variable 'item' from source: include params
33932 1726882897.67725: variable 'item' from source: include params
33932 1726882897.67745: variable 'omit' from source: magic vars
33932 1726882897.67791: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
33932 1726882897.67832: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
33932 1726882897.67855: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
33932 1726882897.67882: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
33932 1726882897.67897: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
33932 1726882897.67932: variable 'inventory_hostname' from source: host vars for 'managed_node1'
33932 1726882897.67940: variable 'ansible_host' from source: host vars for 'managed_node1'
33932 1726882897.67947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33932 1726882897.68054: Set connection var ansible_shell_executable to /bin/sh
33932 1726882897.68072: Set connection var ansible_timeout to 10
33932 1726882897.68083: Set connection var ansible_module_compression to ZIP_DEFLATED
33932 1726882897.68091: Set connection var ansible_pipelining to False
33932 1726882897.68097: Set connection var ansible_connection to ssh
33932 1726882897.68103: Set connection var ansible_shell_type to sh
33932 1726882897.68135: variable 'ansible_shell_executable' from source: unknown
33932 1726882897.68142: variable 'ansible_connection' from source: unknown
33932 1726882897.68149: variable 'ansible_module_compression' from source: unknown
33932 1726882897.68155: variable 'ansible_shell_type' from source: unknown
33932 1726882897.68161: variable 'ansible_shell_executable' from source: unknown
33932 1726882897.68171: variable 'ansible_host' from source: host vars for 'managed_node1'
33932 1726882897.68179: variable 'ansible_pipelining' from source: unknown
33932 1726882897.68185: variable 'ansible_timeout' from source: unknown
33932 1726882897.68191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
33932 1726882897.68396: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
33932 1726882897.68411: variable 'omit' from source: magic vars
33932 1726882897.68420: starting attempt loop
33932 1726882897.68426: running the handler
33932 1726882897.68442: _low_level_execute_command(): starting
33932 1726882897.68459: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
33932 1726882897.69243: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
33932 1726882897.69258: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
33932 1726882897.69280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
33932 1726882897.69298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
33932 1726882897.69346: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
33932 1726882897.69358: stderr chunk (state=3): >>>debug2: match not found <<<
33932 1726882897.69377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
33932 1726882897.69395: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
33932 1726882897.69405: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<<
33932 1726882897.69415: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
33932 1726882897.69425: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
33932 1726882897.69441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
33932 1726882897.69456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
33932 1726882897.69473: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
33932 1726882897.69485: stderr chunk (state=3): >>>debug2: match found <<<
33932 1726882897.69498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
33932 1726882897.69584: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
33932 1726882897.69606: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
33932 1726882897.69620: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
33932 1726882897.69748: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
33932 1726882897.71417: stdout chunk (state=3): >>>/root <<<
33932 1726882897.71614: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
33932 1726882897.71617: stdout chunk (state=3): >>><<<
33932 1726882897.71620: stderr chunk (state=3): >>><<<
33932 1726882897.71733: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.44.90 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
33932 1726882897.71737: _low_level_execute_command(): starting
33932 1726882897.71751: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882897.7163992-34778-216253237626223 `" && echo ansible-tmp-1726882897.7163992-34778-216253237626223="` echo /root/.ansible/tmp/ansible-tmp-1726882897.7163992-34778-216253237626223 `" ) && sleep 0'
33932 1726882897.72336: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
33932 1726882897.72351: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
33932 1726882897.72372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
33932 1726882897.72392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
33932 1726882897.72434: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
33932 1726882897.72448: stderr chunk (state=3): >>>debug2: match not found <<<
33932 1726882897.72462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
33932 1726882897.72486: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
33932 1726882897.72499: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<<
33932 1726882897.72511: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
33932 1726882897.72524: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
33932 1726882897.72538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
33932 1726882897.72554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
33932 1726882897.72573: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
33932 1726882897.72587: stderr chunk (state=3): >>>debug2: match found <<<
33932 1726882897.72601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
33932 1726882897.72682: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
33932 1726882897.72705: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
33932 1726882897.72723: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
33932 1726882897.72846: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
33932 1726882897.74722: stdout chunk (state=3): >>>ansible-tmp-1726882897.7163992-34778-216253237626223=/root/.ansible/tmp/ansible-tmp-1726882897.7163992-34778-216253237626223 <<<
33932 1726882897.74836: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
33932 1726882897.74920: stderr chunk (state=3): >>><<<
33932 1726882897.74935: stdout chunk (state=3): >>><<<
33932 1726882897.75179: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882897.7163992-34778-216253237626223=/root/.ansible/tmp/ansible-tmp-1726882897.7163992-34778-216253237626223
, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.44.90 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
33932 1726882897.75183: variable 'ansible_module_compression' from source: unknown
33932 1726882897.75185: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-33932njcnmxb6/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED
33932 1726882897.75187: variable 'ansible_facts' from source: unknown
33932 1726882897.75217: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882897.7163992-34778-216253237626223/AnsiballZ_stat.py
33932 1726882897.75380: Sending initial data
33932 1726882897.75385: Sent initial data (153 bytes)
33932 1726882897.76461: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
33932 1726882897.76488: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
33932 1726882897.76505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
33932 1726882897.76522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
33932 1726882897.76565: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
33932 1726882897.76581: stderr chunk (state=3): >>>debug2: match not found <<<
33932 1726882897.76606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
33932 1726882897.76623: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
33932 1726882897.76634: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<<
33932 1726882897.76645: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
33932 1726882897.76656: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
33932 1726882897.76673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
33932 1726882897.76690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
33932 1726882897.76711: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
33932 1726882897.76724: stderr chunk (state=3): >>>debug2: match found <<<
33932 1726882897.76737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
33932 1726882897.76827: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
33932 1726882897.76850: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
33932 1726882897.76867: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
33932 1726882897.76994: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
33932 1726882897.78815: stderr chunk (state=3): >>>debug2: Remote version: 3
debug2: Server supports extension "posix-rename@openssh.com" revision 1
debug2: Server supports extension "statvfs@openssh.com" revision 2
debug2: Server supports extension "fstatvfs@openssh.com" revision 2
debug2: Server supports extension "hardlink@openssh.com" revision 1
debug2: Server supports extension "fsync@openssh.com" revision 1
debug2: Server supports extension "lsetstat@openssh.com" revision 1
debug2: Server supports extension "limits@openssh.com" revision 1
debug2: Server supports extension "expand-path@openssh.com" revision 1 <<<
33932 1726882897.78909: stderr chunk (state=3): >>>debug1: Using server download size 261120
debug1: Using server upload size 261120
debug1: Server handle limit 1019; using 64 <<<
33932 1726882897.79003: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-33932njcnmxb6/tmpysmw3icy /root/.ansible/tmp/ansible-tmp-1726882897.7163992-34778-216253237626223/AnsiballZ_stat.py <<<
33932 1726882897.79092: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<<
33932 1726882897.80410: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
33932 1726882897.80570: stderr chunk (state=3): >>><<<
33932 1726882897.80698: stdout chunk (state=3): >>><<<
33932 1726882897.80701: done transferring module to remote
33932 1726882897.80704: _low_level_execute_command(): starting
33932 1726882897.80710: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882897.7163992-34778-216253237626223/ /root/.ansible/tmp/ansible-tmp-1726882897.7163992-34778-216253237626223/AnsiballZ_stat.py && sleep 0'
33932 1726882897.81155: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config <<<
33932 1726882897.81159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
33932 1726882897.81168: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
33932 1726882897.81177: stderr chunk (state=3): >>>debug2: match not found <<<
33932 1726882897.81182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
33932 1726882897.81191: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
33932 1726882897.81198: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<<
33932 1726882897.81203: stderr chunk (state=3): >>>debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config <<<
33932 1726882897.81210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
33932 1726882897.81219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
33932 1726882897.81226: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
33932 1726882897.81288: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK <<<
33932 1726882897.81298: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
33932 1726882897.81402: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
33932 1726882897.83189: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
33932 1726882897.83288: stderr chunk (state=3): >>><<<
33932 1726882897.83294: stdout chunk (state=3): >>><<<
33932 1726882897.83307: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.44.90 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
33932 1726882897.83312: _low_level_execute_command(): starting
33932 1726882897.83315: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9
/root/.ansible/tmp/ansible-tmp-1726882897.7163992-34778-216253237626223/AnsiballZ_stat.py && sleep 0' 33932 1726882897.83786: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882897.83792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882897.83820: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882897.83825: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882897.83834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882897.83844: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882897.83847: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration <<< 33932 1726882897.83853: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882897.83863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882897.83879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882897.83887: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882897.83892: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882897.83897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882897.83944: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882897.83960: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882897.83970: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 33932 1726882897.84084: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882897.97252: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr101", "follow": false, "checksum_algorithm": "sha1"}}} <<< 33932 1726882897.98235: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 33932 1726882897.98294: stderr chunk (state=3): >>><<< 33932 1726882897.98297: stdout chunk (state=3): >>><<< 33932 1726882897.98313: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr101", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 33932 1726882897.98338: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-lsr101', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882897.7163992-34778-216253237626223/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 33932 1726882897.98349: _low_level_execute_command(): starting 33932 1726882897.98354: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882897.7163992-34778-216253237626223/ > /dev/null 2>&1 && sleep 0' 33932 1726882897.98821: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882897.98825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882897.98859: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882897.98870: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882897.98887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882897.98893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882897.98940: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882897.98952: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882897.99057: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882898.00902: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882898.00951: stderr chunk (state=3): >>><<< 33932 1726882898.00954: stdout chunk (state=3): >>><<< 33932 1726882898.00972: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882898.00978: handler run complete 33932 1726882898.00993: attempt loop complete, returning result 33932 1726882898.00996: _execute() done 33932 1726882898.00998: dumping result to json 33932 1726882898.01001: done dumping result, returning 33932 1726882898.01009: done running TaskExecutor() for managed_node1/TASK: Stat profile file [0e448fcc-3ce9-615b-5c48-0000000007f1] 33932 1726882898.01014: sending task result for task 0e448fcc-3ce9-615b-5c48-0000000007f1 33932 1726882898.01113: done sending task result for task 0e448fcc-3ce9-615b-5c48-0000000007f1 33932 1726882898.01116: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 33932 1726882898.01190: no more pending results, returning what we have 33932 1726882898.01193: results queue empty 33932 1726882898.01194: checking for any_errors_fatal 33932 1726882898.01202: done checking for any_errors_fatal 33932 1726882898.01203: checking for max_fail_percentage 33932 1726882898.01205: done checking for max_fail_percentage 33932 1726882898.01206: checking to see if all hosts have failed and the running result is not ok 33932 1726882898.01207: done checking to see if all hosts have failed 33932 1726882898.01208: getting the remaining hosts for this loop 33932 1726882898.01209: done getting the remaining hosts for this loop 33932 1726882898.01213: getting the next task for host managed_node1 33932 1726882898.01218: done getting next task for host managed_node1 33932 1726882898.01221: ^ task is: TASK: Set NM profile exist flag based on the profile files 33932 1726882898.01226: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33932 1726882898.01230: getting variables 33932 1726882898.01231: in VariableManager get_vars() 33932 1726882898.01281: Calling all_inventory to load vars for managed_node1 33932 1726882898.01284: Calling groups_inventory to load vars for managed_node1 33932 1726882898.01287: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882898.01297: Calling all_plugins_play to load vars for managed_node1 33932 1726882898.01299: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882898.01301: Calling groups_plugins_play to load vars for managed_node1 33932 1726882898.02278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882898.03215: done with get_vars() 33932 1726882898.03233: done getting variables 33932 1726882898.03282: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:41:38 -0400 (0:00:00.373) 0:00:18.500 ****** 33932 1726882898.03303: entering _queue_task() for managed_node1/set_fact 33932 1726882898.03520: worker is 1 (out of 1 available) 33932 1726882898.03534: exiting _queue_task() for managed_node1/set_fact 33932 1726882898.03546: done queuing things up, now waiting for results queue to drain 33932 1726882898.03548: waiting for pending results... 33932 1726882898.03721: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 33932 1726882898.03795: in run() - task 0e448fcc-3ce9-615b-5c48-0000000007f2 33932 1726882898.03806: variable 'ansible_search_path' from source: unknown 33932 1726882898.03810: variable 'ansible_search_path' from source: unknown 33932 1726882898.03839: calling self._execute() 33932 1726882898.03920: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882898.03924: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882898.03932: variable 'omit' from source: magic vars 33932 1726882898.04205: variable 'ansible_distribution_major_version' from source: facts 33932 1726882898.04218: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882898.04304: variable 'profile_stat' from source: set_fact 33932 1726882898.04317: Evaluated conditional (profile_stat.stat.exists): False 33932 1726882898.04321: when evaluation is False, skipping this task 33932 1726882898.04324: _execute() done 33932 1726882898.04326: dumping result to json 33932 1726882898.04329: done dumping result, returning 33932 1726882898.04332: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [0e448fcc-3ce9-615b-5c48-0000000007f2] 33932 1726882898.04339: sending task result for task 
0e448fcc-3ce9-615b-5c48-0000000007f2 33932 1726882898.04420: done sending task result for task 0e448fcc-3ce9-615b-5c48-0000000007f2 33932 1726882898.04423: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 33932 1726882898.04495: no more pending results, returning what we have 33932 1726882898.04499: results queue empty 33932 1726882898.04500: checking for any_errors_fatal 33932 1726882898.04507: done checking for any_errors_fatal 33932 1726882898.04507: checking for max_fail_percentage 33932 1726882898.04509: done checking for max_fail_percentage 33932 1726882898.04510: checking to see if all hosts have failed and the running result is not ok 33932 1726882898.04511: done checking to see if all hosts have failed 33932 1726882898.04511: getting the remaining hosts for this loop 33932 1726882898.04513: done getting the remaining hosts for this loop 33932 1726882898.04516: getting the next task for host managed_node1 33932 1726882898.04521: done getting next task for host managed_node1 33932 1726882898.04523: ^ task is: TASK: Get NM profile info 33932 1726882898.04527: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 33932 1726882898.04530: getting variables 33932 1726882898.04535: in VariableManager get_vars() 33932 1726882898.04576: Calling all_inventory to load vars for managed_node1 33932 1726882898.04578: Calling groups_inventory to load vars for managed_node1 33932 1726882898.04580: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882898.04588: Calling all_plugins_play to load vars for managed_node1 33932 1726882898.04590: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882898.04592: Calling groups_plugins_play to load vars for managed_node1 33932 1726882898.05387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882898.06325: done with get_vars() 33932 1726882898.06341: done getting variables 33932 1726882898.06416: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:41:38 -0400 (0:00:00.031) 0:00:18.532 ****** 33932 1726882898.06437: entering _queue_task() for managed_node1/shell 33932 1726882898.06438: Creating lock for shell 33932 1726882898.06672: worker is 1 (out of 1 available) 33932 1726882898.06690: exiting _queue_task() for managed_node1/shell 33932 1726882898.06701: done queuing things up, now waiting for results queue to drain 33932 1726882898.06703: waiting for pending results... 
33932 1726882898.06879: running TaskExecutor() for managed_node1/TASK: Get NM profile info 33932 1726882898.06953: in run() - task 0e448fcc-3ce9-615b-5c48-0000000007f3 33932 1726882898.06965: variable 'ansible_search_path' from source: unknown 33932 1726882898.06969: variable 'ansible_search_path' from source: unknown 33932 1726882898.07000: calling self._execute() 33932 1726882898.07080: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882898.07084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882898.07092: variable 'omit' from source: magic vars 33932 1726882898.07356: variable 'ansible_distribution_major_version' from source: facts 33932 1726882898.07368: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882898.07379: variable 'omit' from source: magic vars 33932 1726882898.07408: variable 'omit' from source: magic vars 33932 1726882898.07484: variable 'profile' from source: include params 33932 1726882898.07487: variable 'item' from source: include params 33932 1726882898.07532: variable 'item' from source: include params 33932 1726882898.07546: variable 'omit' from source: magic vars 33932 1726882898.07585: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882898.07611: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882898.07627: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882898.07639: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882898.07649: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882898.07676: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 
1726882898.07679: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882898.07682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882898.07749: Set connection var ansible_shell_executable to /bin/sh 33932 1726882898.07756: Set connection var ansible_timeout to 10 33932 1726882898.07761: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882898.07767: Set connection var ansible_pipelining to False 33932 1726882898.07770: Set connection var ansible_connection to ssh 33932 1726882898.07775: Set connection var ansible_shell_type to sh 33932 1726882898.07794: variable 'ansible_shell_executable' from source: unknown 33932 1726882898.07797: variable 'ansible_connection' from source: unknown 33932 1726882898.07799: variable 'ansible_module_compression' from source: unknown 33932 1726882898.07802: variable 'ansible_shell_type' from source: unknown 33932 1726882898.07805: variable 'ansible_shell_executable' from source: unknown 33932 1726882898.07807: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882898.07810: variable 'ansible_pipelining' from source: unknown 33932 1726882898.07812: variable 'ansible_timeout' from source: unknown 33932 1726882898.07814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882898.07911: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882898.07919: variable 'omit' from source: magic vars 33932 1726882898.07925: starting attempt loop 33932 1726882898.07928: running the handler 33932 1726882898.07940: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882898.07952: _low_level_execute_command(): starting 33932 1726882898.07960: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 33932 1726882898.08497: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882898.08518: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 33932 1726882898.08531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 33932 1726882898.08542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882898.08587: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882898.08612: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882898.08714: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882898.10371: stdout chunk (state=3): >>>/root <<< 33932 1726882898.10472: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882898.10518: 
stderr chunk (state=3): >>><<< 33932 1726882898.10521: stdout chunk (state=3): >>><<< 33932 1726882898.10545: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882898.10555: _low_level_execute_command(): starting 33932 1726882898.10561: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882898.1054423-34794-279633395262639 `" && echo ansible-tmp-1726882898.1054423-34794-279633395262639="` echo /root/.ansible/tmp/ansible-tmp-1726882898.1054423-34794-279633395262639 `" ) && sleep 0' 33932 1726882898.11004: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882898.11015: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882898.11042: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882898.11046: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882898.11048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882898.11107: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882898.11110: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882898.11212: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882898.13074: stdout chunk (state=3): >>>ansible-tmp-1726882898.1054423-34794-279633395262639=/root/.ansible/tmp/ansible-tmp-1726882898.1054423-34794-279633395262639 <<< 33932 1726882898.13188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882898.13237: stderr chunk (state=3): >>><<< 33932 1726882898.13240: stdout chunk (state=3): >>><<< 33932 1726882898.13254: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882898.1054423-34794-279633395262639=/root/.ansible/tmp/ansible-tmp-1726882898.1054423-34794-279633395262639 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882898.13279: variable 'ansible_module_compression' from source: unknown 33932 1726882898.13317: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-33932njcnmxb6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 33932 1726882898.13348: variable 'ansible_facts' from source: unknown 33932 1726882898.13400: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882898.1054423-34794-279633395262639/AnsiballZ_command.py 33932 1726882898.13499: Sending initial data 33932 1726882898.13503: Sent initial data (156 bytes) 33932 1726882898.14141: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882898.14147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882898.14181: stderr chunk (state=3): >>>debug2: checking match for 'final all' 
host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882898.14192: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882898.14250: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882898.14255: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882898.14360: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882898.16129: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 33932 1726882898.16217: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 33932 1726882898.16311: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-33932njcnmxb6/tmpmb8774jk 
/root/.ansible/tmp/ansible-tmp-1726882898.1054423-34794-279633395262639/AnsiballZ_command.py <<< 33932 1726882898.16401: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 33932 1726882898.17413: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882898.17523: stderr chunk (state=3): >>><<< 33932 1726882898.17526: stdout chunk (state=3): >>><<< 33932 1726882898.17548: done transferring module to remote 33932 1726882898.17560: _low_level_execute_command(): starting 33932 1726882898.17566: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882898.1054423-34794-279633395262639/ /root/.ansible/tmp/ansible-tmp-1726882898.1054423-34794-279633395262639/AnsiballZ_command.py && sleep 0' 33932 1726882898.18024: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882898.18030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882898.18061: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882898.18076: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882898.18131: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 33932 1726882898.18148: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882898.18244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882898.20072: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882898.20145: stderr chunk (state=3): >>><<< 33932 1726882898.20148: stdout chunk (state=3): >>><<< 33932 1726882898.20255: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882898.20261: _low_level_execute_command(): starting 33932 1726882898.20263: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882898.1054423-34794-279633395262639/AnsiballZ_command.py && sleep 0' 33932 1726882898.20902: 
stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882898.20930: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882898.20946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882898.20967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882898.21012: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882898.21054: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882898.21057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 33932 1726882898.21059: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882898.21061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882898.21107: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882898.21116: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882898.21230: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882898.36570: stdout chunk (state=3): >>> {"changed": true, "stdout": "lsr101 /etc/NetworkManager/system-connections/lsr101.nmconnection \nlsr101.90 /etc/NetworkManager/system-connections/lsr101.90.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f 
NAME,FILENAME connection show |grep lsr101 | grep /etc", "start": "2024-09-20 21:41:38.343497", "end": "2024-09-20 21:41:38.364204", "delta": "0:00:00.020707", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr101 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 33932 1726882898.38081: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 33932 1726882898.38085: stdout chunk (state=3): >>><<< 33932 1726882898.38088: stderr chunk (state=3): >>><<< 33932 1726882898.38090: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "lsr101 /etc/NetworkManager/system-connections/lsr101.nmconnection \nlsr101.90 /etc/NetworkManager/system-connections/lsr101.90.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr101 | grep /etc", "start": "2024-09-20 21:41:38.343497", "end": "2024-09-20 21:41:38.364204", "delta": "0:00:00.020707", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr101 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 33932 1726882898.38093: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep lsr101 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882898.1054423-34794-279633395262639/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 33932 1726882898.38096: _low_level_execute_command(): starting 33932 1726882898.38098: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882898.1054423-34794-279633395262639/ > /dev/null 2>&1 && sleep 0' 33932 1726882898.38687: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882898.38702: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 33932 1726882898.38718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882898.38741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882898.38788: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882898.38801: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882898.38815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882898.38833: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882898.38846: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882898.38857: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882898.38877: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882898.38892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882898.38908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882898.38920: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882898.38932: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882898.38946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882898.39024: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882898.39041: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882898.39055: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882898.39315: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 33932 1726882898.40994: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882898.41066: stderr chunk (state=3): >>><<< 33932 1726882898.41080: stdout chunk (state=3): >>><<< 33932 1726882898.41374: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882898.41377: handler run complete 33932 1726882898.41379: Evaluated conditional (False): False 33932 1726882898.41381: attempt loop complete, returning result 33932 1726882898.41383: _execute() done 33932 1726882898.41385: dumping result to json 33932 1726882898.41387: done dumping result, returning 33932 1726882898.41388: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [0e448fcc-3ce9-615b-5c48-0000000007f3] 33932 1726882898.41390: sending task result for task 
0e448fcc-3ce9-615b-5c48-0000000007f3 33932 1726882898.41458: done sending task result for task 0e448fcc-3ce9-615b-5c48-0000000007f3 33932 1726882898.41461: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr101 | grep /etc", "delta": "0:00:00.020707", "end": "2024-09-20 21:41:38.364204", "rc": 0, "start": "2024-09-20 21:41:38.343497" } STDOUT: lsr101 /etc/NetworkManager/system-connections/lsr101.nmconnection lsr101.90 /etc/NetworkManager/system-connections/lsr101.90.nmconnection 33932 1726882898.41547: no more pending results, returning what we have 33932 1726882898.41551: results queue empty 33932 1726882898.41552: checking for any_errors_fatal 33932 1726882898.41560: done checking for any_errors_fatal 33932 1726882898.41561: checking for max_fail_percentage 33932 1726882898.41573: done checking for max_fail_percentage 33932 1726882898.41575: checking to see if all hosts have failed and the running result is not ok 33932 1726882898.41576: done checking to see if all hosts have failed 33932 1726882898.41577: getting the remaining hosts for this loop 33932 1726882898.41580: done getting the remaining hosts for this loop 33932 1726882898.41584: getting the next task for host managed_node1 33932 1726882898.41591: done getting next task for host managed_node1 33932 1726882898.41594: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 33932 1726882898.41598: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33932 1726882898.41603: getting variables 33932 1726882898.41605: in VariableManager get_vars() 33932 1726882898.41652: Calling all_inventory to load vars for managed_node1 33932 1726882898.41656: Calling groups_inventory to load vars for managed_node1 33932 1726882898.41659: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882898.41680: Calling all_plugins_play to load vars for managed_node1 33932 1726882898.41684: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882898.41688: Calling groups_plugins_play to load vars for managed_node1 33932 1726882898.43606: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882898.45590: done with get_vars() 33932 1726882898.45612: done getting variables 33932 1726882898.45677: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:41:38 -0400 (0:00:00.392) 0:00:18.924 ****** 33932 1726882898.45710: entering _queue_task() for 
managed_node1/set_fact 33932 1726882898.46019: worker is 1 (out of 1 available) 33932 1726882898.46031: exiting _queue_task() for managed_node1/set_fact 33932 1726882898.46042: done queuing things up, now waiting for results queue to drain 33932 1726882898.46044: waiting for pending results... 33932 1726882898.46335: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 33932 1726882898.46465: in run() - task 0e448fcc-3ce9-615b-5c48-0000000007f4 33932 1726882898.46494: variable 'ansible_search_path' from source: unknown 33932 1726882898.46504: variable 'ansible_search_path' from source: unknown 33932 1726882898.46546: calling self._execute() 33932 1726882898.46661: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882898.46683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882898.46697: variable 'omit' from source: magic vars 33932 1726882898.48026: variable 'ansible_distribution_major_version' from source: facts 33932 1726882898.48053: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882898.48209: variable 'nm_profile_exists' from source: set_fact 33932 1726882898.48230: Evaluated conditional (nm_profile_exists.rc == 0): True 33932 1726882898.48241: variable 'omit' from source: magic vars 33932 1726882898.48305: variable 'omit' from source: magic vars 33932 1726882898.48341: variable 'omit' from source: magic vars 33932 1726882898.48398: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882898.48435: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882898.48459: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882898.48493: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 33932 1726882898.48512: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882898.48560: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882898.48577: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882898.48594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882898.48711: Set connection var ansible_shell_executable to /bin/sh 33932 1726882898.48725: Set connection var ansible_timeout to 10 33932 1726882898.48736: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882898.48746: Set connection var ansible_pipelining to False 33932 1726882898.48753: Set connection var ansible_connection to ssh 33932 1726882898.48760: Set connection var ansible_shell_type to sh 33932 1726882898.48793: variable 'ansible_shell_executable' from source: unknown 33932 1726882898.48810: variable 'ansible_connection' from source: unknown 33932 1726882898.48818: variable 'ansible_module_compression' from source: unknown 33932 1726882898.48825: variable 'ansible_shell_type' from source: unknown 33932 1726882898.48831: variable 'ansible_shell_executable' from source: unknown 33932 1726882898.48838: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882898.48845: variable 'ansible_pipelining' from source: unknown 33932 1726882898.48851: variable 'ansible_timeout' from source: unknown 33932 1726882898.48859: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882898.49015: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 
33932 1726882898.49038: variable 'omit' from source: magic vars 33932 1726882898.49049: starting attempt loop 33932 1726882898.49056: running the handler 33932 1726882898.49079: handler run complete 33932 1726882898.49093: attempt loop complete, returning result 33932 1726882898.49098: _execute() done 33932 1726882898.49104: dumping result to json 33932 1726882898.49110: done dumping result, returning 33932 1726882898.49120: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0e448fcc-3ce9-615b-5c48-0000000007f4] 33932 1726882898.49139: sending task result for task 0e448fcc-3ce9-615b-5c48-0000000007f4 ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 33932 1726882898.49301: no more pending results, returning what we have 33932 1726882898.49305: results queue empty 33932 1726882898.49306: checking for any_errors_fatal 33932 1726882898.49313: done checking for any_errors_fatal 33932 1726882898.49314: checking for max_fail_percentage 33932 1726882898.49316: done checking for max_fail_percentage 33932 1726882898.49317: checking to see if all hosts have failed and the running result is not ok 33932 1726882898.49318: done checking to see if all hosts have failed 33932 1726882898.49319: getting the remaining hosts for this loop 33932 1726882898.49321: done getting the remaining hosts for this loop 33932 1726882898.49324: getting the next task for host managed_node1 33932 1726882898.49335: done getting next task for host managed_node1 33932 1726882898.49337: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 33932 1726882898.49342: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33932 1726882898.49346: getting variables 33932 1726882898.49348: in VariableManager get_vars() 33932 1726882898.49400: Calling all_inventory to load vars for managed_node1 33932 1726882898.49403: Calling groups_inventory to load vars for managed_node1 33932 1726882898.49406: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882898.49418: Calling all_plugins_play to load vars for managed_node1 33932 1726882898.49421: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882898.49424: Calling groups_plugins_play to load vars for managed_node1 33932 1726882898.50476: done sending task result for task 0e448fcc-3ce9-615b-5c48-0000000007f4 33932 1726882898.50480: WORKER PROCESS EXITING 33932 1726882898.51271: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882898.52224: done with get_vars() 33932 1726882898.52240: done getting variables 33932 1726882898.52285: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 33932 1726882898.52371: variable 'profile' 
from source: include params 33932 1726882898.52374: variable 'item' from source: include params 33932 1726882898.52417: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-lsr101] ************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:41:38 -0400 (0:00:00.067) 0:00:18.992 ****** 33932 1726882898.52442: entering _queue_task() for managed_node1/command 33932 1726882898.52642: worker is 1 (out of 1 available) 33932 1726882898.52656: exiting _queue_task() for managed_node1/command 33932 1726882898.52672: done queuing things up, now waiting for results queue to drain 33932 1726882898.52674: waiting for pending results... 33932 1726882898.52857: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-lsr101 33932 1726882898.53051: in run() - task 0e448fcc-3ce9-615b-5c48-0000000007f6 33932 1726882898.53056: variable 'ansible_search_path' from source: unknown 33932 1726882898.53059: variable 'ansible_search_path' from source: unknown 33932 1726882898.53062: calling self._execute() 33932 1726882898.53484: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882898.53487: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882898.53490: variable 'omit' from source: magic vars 33932 1726882898.53590: variable 'ansible_distribution_major_version' from source: facts 33932 1726882898.53593: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882898.53641: variable 'profile_stat' from source: set_fact 33932 1726882898.53653: Evaluated conditional (profile_stat.stat.exists): False 33932 1726882898.53657: when evaluation is False, skipping this task 33932 1726882898.53660: _execute() done 33932 1726882898.53663: dumping result to json 33932 1726882898.53670: done dumping result, returning 33932 
1726882898.53677: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-lsr101 [0e448fcc-3ce9-615b-5c48-0000000007f6] 33932 1726882898.53692: sending task result for task 0e448fcc-3ce9-615b-5c48-0000000007f6 33932 1726882898.53772: done sending task result for task 0e448fcc-3ce9-615b-5c48-0000000007f6 33932 1726882898.53775: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 33932 1726882898.53836: no more pending results, returning what we have 33932 1726882898.53840: results queue empty 33932 1726882898.53841: checking for any_errors_fatal 33932 1726882898.53848: done checking for any_errors_fatal 33932 1726882898.53849: checking for max_fail_percentage 33932 1726882898.53850: done checking for max_fail_percentage 33932 1726882898.53851: checking to see if all hosts have failed and the running result is not ok 33932 1726882898.53852: done checking to see if all hosts have failed 33932 1726882898.53853: getting the remaining hosts for this loop 33932 1726882898.53855: done getting the remaining hosts for this loop 33932 1726882898.53858: getting the next task for host managed_node1 33932 1726882898.53867: done getting next task for host managed_node1 33932 1726882898.53872: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 33932 1726882898.53877: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33932 1726882898.53882: getting variables 33932 1726882898.53883: in VariableManager get_vars() 33932 1726882898.53924: Calling all_inventory to load vars for managed_node1 33932 1726882898.53927: Calling groups_inventory to load vars for managed_node1 33932 1726882898.53930: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882898.53943: Calling all_plugins_play to load vars for managed_node1 33932 1726882898.53946: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882898.53948: Calling groups_plugins_play to load vars for managed_node1 33932 1726882898.56160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882898.57914: done with get_vars() 33932 1726882898.57942: done getting variables 33932 1726882898.58013: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 33932 1726882898.58561: variable 'profile' from source: include params 33932 1726882898.58567: variable 'item' from source: include params 33932 1726882898.58633: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-lsr101] ********************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:41:38 -0400 (0:00:00.062) 0:00:19.054 ****** 33932 1726882898.58670: entering _queue_task() for 
managed_node1/set_fact 33932 1726882898.58962: worker is 1 (out of 1 available) 33932 1726882898.58981: exiting _queue_task() for managed_node1/set_fact 33932 1726882898.58993: done queuing things up, now waiting for results queue to drain 33932 1726882898.58995: waiting for pending results... 33932 1726882898.59314: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-lsr101 33932 1726882898.59415: in run() - task 0e448fcc-3ce9-615b-5c48-0000000007f7 33932 1726882898.59429: variable 'ansible_search_path' from source: unknown 33932 1726882898.59433: variable 'ansible_search_path' from source: unknown 33932 1726882898.59483: calling self._execute() 33932 1726882898.59589: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882898.59593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882898.59602: variable 'omit' from source: magic vars 33932 1726882898.59978: variable 'ansible_distribution_major_version' from source: facts 33932 1726882898.59999: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882898.60127: variable 'profile_stat' from source: set_fact 33932 1726882898.60142: Evaluated conditional (profile_stat.stat.exists): False 33932 1726882898.60146: when evaluation is False, skipping this task 33932 1726882898.60148: _execute() done 33932 1726882898.60151: dumping result to json 33932 1726882898.60153: done dumping result, returning 33932 1726882898.60156: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-lsr101 [0e448fcc-3ce9-615b-5c48-0000000007f7] 33932 1726882898.60164: sending task result for task 0e448fcc-3ce9-615b-5c48-0000000007f7 33932 1726882898.60256: done sending task result for task 0e448fcc-3ce9-615b-5c48-0000000007f7 33932 1726882898.60260: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", 
"skip_reason": "Conditional result was False" } 33932 1726882898.60313: no more pending results, returning what we have 33932 1726882898.60319: results queue empty 33932 1726882898.60320: checking for any_errors_fatal 33932 1726882898.60329: done checking for any_errors_fatal 33932 1726882898.60330: checking for max_fail_percentage 33932 1726882898.60331: done checking for max_fail_percentage 33932 1726882898.60332: checking to see if all hosts have failed and the running result is not ok 33932 1726882898.60333: done checking to see if all hosts have failed 33932 1726882898.60334: getting the remaining hosts for this loop 33932 1726882898.60336: done getting the remaining hosts for this loop 33932 1726882898.60340: getting the next task for host managed_node1 33932 1726882898.60347: done getting next task for host managed_node1 33932 1726882898.60350: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 33932 1726882898.60355: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882898.60360: getting variables 33932 1726882898.60362: in VariableManager get_vars() 33932 1726882898.60411: Calling all_inventory to load vars for managed_node1 33932 1726882898.60415: Calling groups_inventory to load vars for managed_node1 33932 1726882898.60418: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882898.60432: Calling all_plugins_play to load vars for managed_node1 33932 1726882898.60435: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882898.60439: Calling groups_plugins_play to load vars for managed_node1 33932 1726882898.62314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882898.63283: done with get_vars() 33932 1726882898.63298: done getting variables 33932 1726882898.63339: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 33932 1726882898.63413: variable 'profile' from source: include params 33932 1726882898.63416: variable 'item' from source: include params 33932 1726882898.63456: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-lsr101] ***************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:41:38 -0400 (0:00:00.048) 0:00:19.102 ****** 33932 1726882898.63482: entering _queue_task() for managed_node1/command 33932 1726882898.63660: worker is 1 (out of 1 available) 33932 1726882898.63676: exiting _queue_task() for managed_node1/command 33932 1726882898.63689: done queuing things up, now waiting for results queue to drain 33932 1726882898.63691: waiting for pending results... 
33932 1726882898.63867: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-lsr101 33932 1726882898.63998: in run() - task 0e448fcc-3ce9-615b-5c48-0000000007f8 33932 1726882898.64041: variable 'ansible_search_path' from source: unknown 33932 1726882898.64045: variable 'ansible_search_path' from source: unknown 33932 1726882898.64172: calling self._execute() 33932 1726882898.64176: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882898.64180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882898.64184: variable 'omit' from source: magic vars 33932 1726882898.65174: variable 'ansible_distribution_major_version' from source: facts 33932 1726882898.65185: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882898.65366: variable 'profile_stat' from source: set_fact 33932 1726882898.65403: Evaluated conditional (profile_stat.stat.exists): False 33932 1726882898.65415: when evaluation is False, skipping this task 33932 1726882898.65422: _execute() done 33932 1726882898.65458: dumping result to json 33932 1726882898.65472: done dumping result, returning 33932 1726882898.65483: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-lsr101 [0e448fcc-3ce9-615b-5c48-0000000007f8] 33932 1726882898.65506: sending task result for task 0e448fcc-3ce9-615b-5c48-0000000007f8 33932 1726882898.65608: done sending task result for task 0e448fcc-3ce9-615b-5c48-0000000007f8 33932 1726882898.65611: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 33932 1726882898.65658: no more pending results, returning what we have 33932 1726882898.65677: results queue empty 33932 1726882898.65679: checking for any_errors_fatal 33932 1726882898.65685: done checking for any_errors_fatal 33932 1726882898.65685: checking for 
max_fail_percentage 33932 1726882898.65687: done checking for max_fail_percentage 33932 1726882898.65688: checking to see if all hosts have failed and the running result is not ok 33932 1726882898.65689: done checking to see if all hosts have failed 33932 1726882898.65689: getting the remaining hosts for this loop 33932 1726882898.65691: done getting the remaining hosts for this loop 33932 1726882898.65695: getting the next task for host managed_node1 33932 1726882898.65701: done getting next task for host managed_node1 33932 1726882898.65703: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 33932 1726882898.65707: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882898.65710: getting variables 33932 1726882898.65712: in VariableManager get_vars() 33932 1726882898.65746: Calling all_inventory to load vars for managed_node1 33932 1726882898.65748: Calling groups_inventory to load vars for managed_node1 33932 1726882898.65753: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882898.65766: Calling all_plugins_play to load vars for managed_node1 33932 1726882898.65773: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882898.65778: Calling groups_plugins_play to load vars for managed_node1 33932 1726882898.67095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882898.68190: done with get_vars() 33932 1726882898.68205: done getting variables 33932 1726882898.68253: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 33932 1726882898.68399: variable 'profile' from source: include params 33932 1726882898.68403: variable 'item' from source: include params 33932 1726882898.68491: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-lsr101] ************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:41:38 -0400 (0:00:00.050) 0:00:19.152 ****** 33932 1726882898.68520: entering _queue_task() for managed_node1/set_fact 33932 1726882898.68794: worker is 1 (out of 1 available) 33932 1726882898.68807: exiting _queue_task() for managed_node1/set_fact 33932 1726882898.68818: done queuing things up, now waiting for results queue to drain 33932 1726882898.68819: waiting for pending results... 
33932 1726882898.69333: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-lsr101 33932 1726882898.69450: in run() - task 0e448fcc-3ce9-615b-5c48-0000000007f9 33932 1726882898.69474: variable 'ansible_search_path' from source: unknown 33932 1726882898.69479: variable 'ansible_search_path' from source: unknown 33932 1726882898.69520: calling self._execute() 33932 1726882898.69626: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882898.69630: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882898.69641: variable 'omit' from source: magic vars 33932 1726882898.70030: variable 'ansible_distribution_major_version' from source: facts 33932 1726882898.70049: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882898.70190: variable 'profile_stat' from source: set_fact 33932 1726882898.70203: Evaluated conditional (profile_stat.stat.exists): False 33932 1726882898.70207: when evaluation is False, skipping this task 33932 1726882898.70211: _execute() done 33932 1726882898.70214: dumping result to json 33932 1726882898.70216: done dumping result, returning 33932 1726882898.70227: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-lsr101 [0e448fcc-3ce9-615b-5c48-0000000007f9] 33932 1726882898.70233: sending task result for task 0e448fcc-3ce9-615b-5c48-0000000007f9 33932 1726882898.70320: done sending task result for task 0e448fcc-3ce9-615b-5c48-0000000007f9 33932 1726882898.70323: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 33932 1726882898.70375: no more pending results, returning what we have 33932 1726882898.70379: results queue empty 33932 1726882898.70381: checking for any_errors_fatal 33932 1726882898.70388: done checking for any_errors_fatal 33932 1726882898.70388: checking 
for max_fail_percentage 33932 1726882898.70390: done checking for max_fail_percentage 33932 1726882898.70391: checking to see if all hosts have failed and the running result is not ok 33932 1726882898.70392: done checking to see if all hosts have failed 33932 1726882898.70393: getting the remaining hosts for this loop 33932 1726882898.70394: done getting the remaining hosts for this loop 33932 1726882898.70398: getting the next task for host managed_node1 33932 1726882898.70407: done getting next task for host managed_node1 33932 1726882898.70410: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 33932 1726882898.70413: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882898.70419: getting variables 33932 1726882898.70421: in VariableManager get_vars() 33932 1726882898.70465: Calling all_inventory to load vars for managed_node1 33932 1726882898.70468: Calling groups_inventory to load vars for managed_node1 33932 1726882898.70471: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882898.70484: Calling all_plugins_play to load vars for managed_node1 33932 1726882898.70487: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882898.70490: Calling groups_plugins_play to load vars for managed_node1 33932 1726882898.72324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882898.74170: done with get_vars() 33932 1726882898.74192: done getting variables 33932 1726882898.74248: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 33932 1726882898.74367: variable 'profile' from source: include params 33932 1726882898.74370: variable 'item' from source: include params 33932 1726882898.74433: variable 'item' from source: include params TASK [Assert that the profile is present - 'lsr101'] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:41:38 -0400 (0:00:00.059) 0:00:19.212 ****** 33932 1726882898.74461: entering _queue_task() for managed_node1/assert 33932 1726882898.74726: worker is 1 (out of 1 available) 33932 1726882898.74738: exiting _queue_task() for managed_node1/assert 33932 1726882898.74750: done queuing things up, now waiting for results queue to drain 33932 1726882898.74751: waiting for pending results... 
33932 1726882898.75047: running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'lsr101' 33932 1726882898.75145: in run() - task 0e448fcc-3ce9-615b-5c48-0000000006b9 33932 1726882898.75159: variable 'ansible_search_path' from source: unknown 33932 1726882898.75163: variable 'ansible_search_path' from source: unknown 33932 1726882898.75203: calling self._execute() 33932 1726882898.75308: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882898.75313: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882898.75322: variable 'omit' from source: magic vars 33932 1726882898.75706: variable 'ansible_distribution_major_version' from source: facts 33932 1726882898.75717: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882898.75723: variable 'omit' from source: magic vars 33932 1726882898.75762: variable 'omit' from source: magic vars 33932 1726882898.75873: variable 'profile' from source: include params 33932 1726882898.75877: variable 'item' from source: include params 33932 1726882898.75951: variable 'item' from source: include params 33932 1726882898.75976: variable 'omit' from source: magic vars 33932 1726882898.76027: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882898.76060: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882898.76084: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882898.76106: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882898.76126: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882898.76153: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 33932 1726882898.76156: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882898.76159: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882898.76270: Set connection var ansible_shell_executable to /bin/sh 33932 1726882898.76276: Set connection var ansible_timeout to 10 33932 1726882898.76288: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882898.76294: Set connection var ansible_pipelining to False 33932 1726882898.76297: Set connection var ansible_connection to ssh 33932 1726882898.76299: Set connection var ansible_shell_type to sh 33932 1726882898.76331: variable 'ansible_shell_executable' from source: unknown 33932 1726882898.76337: variable 'ansible_connection' from source: unknown 33932 1726882898.76340: variable 'ansible_module_compression' from source: unknown 33932 1726882898.76342: variable 'ansible_shell_type' from source: unknown 33932 1726882898.76344: variable 'ansible_shell_executable' from source: unknown 33932 1726882898.76348: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882898.76351: variable 'ansible_pipelining' from source: unknown 33932 1726882898.76354: variable 'ansible_timeout' from source: unknown 33932 1726882898.76359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882898.76507: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882898.76518: variable 'omit' from source: magic vars 33932 1726882898.76523: starting attempt loop 33932 1726882898.76527: running the handler 33932 1726882898.76654: variable 'lsr_net_profile_exists' from source: set_fact 33932 1726882898.76660: Evaluated conditional 
(lsr_net_profile_exists): True 33932 1726882898.76672: handler run complete 33932 1726882898.76687: attempt loop complete, returning result 33932 1726882898.76690: _execute() done 33932 1726882898.76693: dumping result to json 33932 1726882898.76696: done dumping result, returning 33932 1726882898.76702: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'lsr101' [0e448fcc-3ce9-615b-5c48-0000000006b9] 33932 1726882898.76707: sending task result for task 0e448fcc-3ce9-615b-5c48-0000000006b9 33932 1726882898.76796: done sending task result for task 0e448fcc-3ce9-615b-5c48-0000000006b9 33932 1726882898.76799: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 33932 1726882898.76878: no more pending results, returning what we have 33932 1726882898.76881: results queue empty 33932 1726882898.76882: checking for any_errors_fatal 33932 1726882898.76888: done checking for any_errors_fatal 33932 1726882898.76889: checking for max_fail_percentage 33932 1726882898.76891: done checking for max_fail_percentage 33932 1726882898.76892: checking to see if all hosts have failed and the running result is not ok 33932 1726882898.76893: done checking to see if all hosts have failed 33932 1726882898.76894: getting the remaining hosts for this loop 33932 1726882898.76896: done getting the remaining hosts for this loop 33932 1726882898.76899: getting the next task for host managed_node1 33932 1726882898.76906: done getting next task for host managed_node1 33932 1726882898.76908: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 33932 1726882898.76912: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33932 1726882898.76916: getting variables 33932 1726882898.76918: in VariableManager get_vars() 33932 1726882898.76957: Calling all_inventory to load vars for managed_node1 33932 1726882898.76961: Calling groups_inventory to load vars for managed_node1 33932 1726882898.76966: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882898.76978: Calling all_plugins_play to load vars for managed_node1 33932 1726882898.76982: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882898.76985: Calling groups_plugins_play to load vars for managed_node1 33932 1726882898.82679: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882898.84531: done with get_vars() 33932 1726882898.84554: done getting variables 33932 1726882898.84613: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 33932 1726882898.84716: variable 'profile' from source: include params 33932 1726882898.84719: variable 'item' from source: include params 33932 1726882898.84778: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'lsr101'] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:41:38 -0400 
(0:00:00.103) 0:00:19.315 ****** 33932 1726882898.84818: entering _queue_task() for managed_node1/assert 33932 1726882898.85149: worker is 1 (out of 1 available) 33932 1726882898.85163: exiting _queue_task() for managed_node1/assert 33932 1726882898.85176: done queuing things up, now waiting for results queue to drain 33932 1726882898.85178: waiting for pending results... 33932 1726882898.85628: running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'lsr101' 33932 1726882898.85678: in run() - task 0e448fcc-3ce9-615b-5c48-0000000006ba 33932 1726882898.85687: variable 'ansible_search_path' from source: unknown 33932 1726882898.85690: variable 'ansible_search_path' from source: unknown 33932 1726882898.85732: calling self._execute() 33932 1726882898.85841: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882898.85847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882898.85861: variable 'omit' from source: magic vars 33932 1726882898.86309: variable 'ansible_distribution_major_version' from source: facts 33932 1726882898.86326: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882898.86332: variable 'omit' from source: magic vars 33932 1726882898.86379: variable 'omit' from source: magic vars 33932 1726882898.86490: variable 'profile' from source: include params 33932 1726882898.86494: variable 'item' from source: include params 33932 1726882898.86567: variable 'item' from source: include params 33932 1726882898.86593: variable 'omit' from source: magic vars 33932 1726882898.86636: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882898.86676: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882898.86705: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 
1726882898.86726: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882898.86736: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882898.86773: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882898.86777: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882898.86780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882898.86893: Set connection var ansible_shell_executable to /bin/sh 33932 1726882898.86901: Set connection var ansible_timeout to 10 33932 1726882898.86911: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882898.86921: Set connection var ansible_pipelining to False 33932 1726882898.86924: Set connection var ansible_connection to ssh 33932 1726882898.86926: Set connection var ansible_shell_type to sh 33932 1726882898.86952: variable 'ansible_shell_executable' from source: unknown 33932 1726882898.86955: variable 'ansible_connection' from source: unknown 33932 1726882898.86957: variable 'ansible_module_compression' from source: unknown 33932 1726882898.86960: variable 'ansible_shell_type' from source: unknown 33932 1726882898.86967: variable 'ansible_shell_executable' from source: unknown 33932 1726882898.86973: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882898.86976: variable 'ansible_pipelining' from source: unknown 33932 1726882898.86979: variable 'ansible_timeout' from source: unknown 33932 1726882898.86981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882898.87121: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882898.87140: variable 'omit' from source: magic vars 33932 1726882898.87145: starting attempt loop 33932 1726882898.87148: running the handler 33932 1726882898.87271: variable 'lsr_net_profile_ansible_managed' from source: set_fact 33932 1726882898.87274: Evaluated conditional (lsr_net_profile_ansible_managed): True 33932 1726882898.87281: handler run complete 33932 1726882898.87300: attempt loop complete, returning result 33932 1726882898.87303: _execute() done 33932 1726882898.87306: dumping result to json 33932 1726882898.87308: done dumping result, returning 33932 1726882898.87316: done running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'lsr101' [0e448fcc-3ce9-615b-5c48-0000000006ba] 33932 1726882898.87321: sending task result for task 0e448fcc-3ce9-615b-5c48-0000000006ba ok: [managed_node1] => { "changed": false } MSG: All assertions passed 33932 1726882898.87466: no more pending results, returning what we have 33932 1726882898.87470: results queue empty 33932 1726882898.87471: checking for any_errors_fatal 33932 1726882898.87480: done checking for any_errors_fatal 33932 1726882898.87481: checking for max_fail_percentage 33932 1726882898.87483: done checking for max_fail_percentage 33932 1726882898.87484: checking to see if all hosts have failed and the running result is not ok 33932 1726882898.87484: done checking to see if all hosts have failed 33932 1726882898.87485: getting the remaining hosts for this loop 33932 1726882898.87487: done getting the remaining hosts for this loop 33932 1726882898.87491: getting the next task for host managed_node1 33932 1726882898.87497: done getting next task for host managed_node1 33932 1726882898.87501: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 
33932 1726882898.87505: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33932 1726882898.87510: getting variables 33932 1726882898.87513: in VariableManager get_vars() 33932 1726882898.87555: Calling all_inventory to load vars for managed_node1 33932 1726882898.87559: Calling groups_inventory to load vars for managed_node1 33932 1726882898.87561: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882898.87576: Calling all_plugins_play to load vars for managed_node1 33932 1726882898.87579: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882898.87583: Calling groups_plugins_play to load vars for managed_node1 33932 1726882898.88113: done sending task result for task 0e448fcc-3ce9-615b-5c48-0000000006ba 33932 1726882898.88627: WORKER PROCESS EXITING 33932 1726882898.89451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882898.91739: done with get_vars() 33932 1726882898.91762: done getting variables 33932 1726882898.91816: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 33932 1726882898.91922: variable 'profile' from source: include params 33932 1726882898.91926: variable 'item' from 
source: include params 33932 1726882898.91985: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in lsr101] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:41:38 -0400 (0:00:00.072) 0:00:19.387 ****** 33932 1726882898.92020: entering _queue_task() for managed_node1/assert 33932 1726882898.92280: worker is 1 (out of 1 available) 33932 1726882898.92293: exiting _queue_task() for managed_node1/assert 33932 1726882898.92305: done queuing things up, now waiting for results queue to drain 33932 1726882898.92306: waiting for pending results... 33932 1726882898.92577: running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in lsr101 33932 1726882898.92689: in run() - task 0e448fcc-3ce9-615b-5c48-0000000006bb 33932 1726882898.92709: variable 'ansible_search_path' from source: unknown 33932 1726882898.92717: variable 'ansible_search_path' from source: unknown 33932 1726882898.92760: calling self._execute() 33932 1726882898.92864: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882898.92881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882898.92895: variable 'omit' from source: magic vars 33932 1726882898.93254: variable 'ansible_distribution_major_version' from source: facts 33932 1726882898.93276: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882898.93290: variable 'omit' from source: magic vars 33932 1726882898.93333: variable 'omit' from source: magic vars 33932 1726882898.93440: variable 'profile' from source: include params 33932 1726882898.93449: variable 'item' from source: include params 33932 1726882898.93522: variable 'item' from source: include params 33932 1726882898.93544: variable 'omit' from source: magic vars 33932 1726882898.93592: 
trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882898.93632: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882898.93654: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882898.93680: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882898.93696: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882898.93731: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882898.93740: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882898.93747: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882898.93857: Set connection var ansible_shell_executable to /bin/sh 33932 1726882898.93875: Set connection var ansible_timeout to 10 33932 1726882898.93885: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882898.93894: Set connection var ansible_pipelining to False 33932 1726882898.93899: Set connection var ansible_connection to ssh 33932 1726882898.93905: Set connection var ansible_shell_type to sh 33932 1726882898.93930: variable 'ansible_shell_executable' from source: unknown 33932 1726882898.93941: variable 'ansible_connection' from source: unknown 33932 1726882898.93947: variable 'ansible_module_compression' from source: unknown 33932 1726882898.93954: variable 'ansible_shell_type' from source: unknown 33932 1726882898.93960: variable 'ansible_shell_executable' from source: unknown 33932 1726882898.93971: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882898.93980: variable 'ansible_pipelining' from source: unknown 33932 1726882898.93986: variable 'ansible_timeout' 
from source: unknown 33932 1726882898.93993: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882898.94128: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882898.94143: variable 'omit' from source: magic vars 33932 1726882898.94156: starting attempt loop 33932 1726882898.94166: running the handler 33932 1726882898.94277: variable 'lsr_net_profile_fingerprint' from source: set_fact 33932 1726882898.94287: Evaluated conditional (lsr_net_profile_fingerprint): True 33932 1726882898.94296: handler run complete 33932 1726882898.94313: attempt loop complete, returning result 33932 1726882898.94319: _execute() done 33932 1726882898.94325: dumping result to json 33932 1726882898.94331: done dumping result, returning 33932 1726882898.94341: done running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in lsr101 [0e448fcc-3ce9-615b-5c48-0000000006bb] 33932 1726882898.94348: sending task result for task 0e448fcc-3ce9-615b-5c48-0000000006bb 33932 1726882898.94448: done sending task result for task 0e448fcc-3ce9-615b-5c48-0000000006bb 33932 1726882898.94456: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 33932 1726882898.94520: no more pending results, returning what we have 33932 1726882898.94523: results queue empty 33932 1726882898.94524: checking for any_errors_fatal 33932 1726882898.94533: done checking for any_errors_fatal 33932 1726882898.94534: checking for max_fail_percentage 33932 1726882898.94536: done checking for max_fail_percentage 33932 1726882898.94537: checking to see if all hosts have failed and the running result is not ok 33932 1726882898.94538: done checking to see if all 
hosts have failed 33932 1726882898.94538: getting the remaining hosts for this loop 33932 1726882898.94540: done getting the remaining hosts for this loop 33932 1726882898.94544: getting the next task for host managed_node1 33932 1726882898.94553: done getting next task for host managed_node1 33932 1726882898.94556: ^ task is: TASK: Include the task 'get_profile_stat.yml' 33932 1726882898.94559: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33932 1726882898.94566: getting variables 33932 1726882898.94571: in VariableManager get_vars() 33932 1726882898.94612: Calling all_inventory to load vars for managed_node1 33932 1726882898.94615: Calling groups_inventory to load vars for managed_node1 33932 1726882898.94618: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882898.94629: Calling all_plugins_play to load vars for managed_node1 33932 1726882898.94633: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882898.94636: Calling groups_plugins_play to load vars for managed_node1 33932 1726882898.96413: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882898.98198: done with get_vars() 33932 1726882898.98223: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 21:41:38 -0400 (0:00:00.062) 0:00:19.450 ****** 33932 1726882898.98319: entering _queue_task() for managed_node1/include_tasks 33932 1726882898.98608: worker is 1 (out of 1 available) 33932 1726882898.98622: exiting _queue_task() for managed_node1/include_tasks 33932 1726882898.98635: done queuing things up, now waiting for results queue to drain 33932 1726882898.98637: waiting for pending results... 33932 1726882898.98928: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' 33932 1726882898.99049: in run() - task 0e448fcc-3ce9-615b-5c48-0000000006bf 33932 1726882898.99076: variable 'ansible_search_path' from source: unknown 33932 1726882898.99087: variable 'ansible_search_path' from source: unknown 33932 1726882898.99128: calling self._execute() 33932 1726882898.99235: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882898.99245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882898.99259: variable 'omit' from source: magic vars 33932 1726882898.99632: variable 'ansible_distribution_major_version' from source: facts 33932 1726882898.99651: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882898.99661: _execute() done 33932 1726882898.99676: dumping result to json 33932 1726882898.99683: done dumping result, returning 33932 1726882898.99693: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [0e448fcc-3ce9-615b-5c48-0000000006bf] 33932 1726882898.99703: sending task result for task 0e448fcc-3ce9-615b-5c48-0000000006bf 33932 1726882898.99810: done sending task result for task 0e448fcc-3ce9-615b-5c48-0000000006bf 33932 1726882898.99819: WORKER PROCESS EXITING 33932 1726882898.99864: no more pending results, returning what we have 33932 
1726882898.99873: in VariableManager get_vars() 33932 1726882898.99925: Calling all_inventory to load vars for managed_node1 33932 1726882898.99927: Calling groups_inventory to load vars for managed_node1 33932 1726882898.99930: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882898.99943: Calling all_plugins_play to load vars for managed_node1 33932 1726882898.99946: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882898.99949: Calling groups_plugins_play to load vars for managed_node1 33932 1726882899.01547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882899.03291: done with get_vars() 33932 1726882899.03309: variable 'ansible_search_path' from source: unknown 33932 1726882899.03310: variable 'ansible_search_path' from source: unknown 33932 1726882899.03347: we have included files to process 33932 1726882899.03348: generating all_blocks data 33932 1726882899.03350: done generating all_blocks data 33932 1726882899.03355: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 33932 1726882899.03357: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 33932 1726882899.03359: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 33932 1726882899.04228: done processing included file 33932 1726882899.04230: iterating over new_blocks loaded from include file 33932 1726882899.04232: in VariableManager get_vars() 33932 1726882899.04251: done with get_vars() 33932 1726882899.04253: filtering new block on tags 33932 1726882899.04282: done filtering new block on tags 33932 1726882899.04285: in VariableManager get_vars() 33932 1726882899.04302: done with get_vars() 33932 1726882899.04304: filtering 
new block on tags 33932 1726882899.04324: done filtering new block on tags 33932 1726882899.04326: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1 33932 1726882899.04331: extending task lists for all hosts with included blocks 33932 1726882899.04510: done extending task lists 33932 1726882899.04511: done processing included files 33932 1726882899.04512: results queue empty 33932 1726882899.04513: checking for any_errors_fatal 33932 1726882899.04516: done checking for any_errors_fatal 33932 1726882899.04517: checking for max_fail_percentage 33932 1726882899.04518: done checking for max_fail_percentage 33932 1726882899.04519: checking to see if all hosts have failed and the running result is not ok 33932 1726882899.04519: done checking to see if all hosts have failed 33932 1726882899.04520: getting the remaining hosts for this loop 33932 1726882899.04522: done getting the remaining hosts for this loop 33932 1726882899.04524: getting the next task for host managed_node1 33932 1726882899.04528: done getting next task for host managed_node1 33932 1726882899.04530: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 33932 1726882899.04533: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33932 1726882899.04535: getting variables 33932 1726882899.04536: in VariableManager get_vars() 33932 1726882899.04549: Calling all_inventory to load vars for managed_node1 33932 1726882899.04551: Calling groups_inventory to load vars for managed_node1 33932 1726882899.04553: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882899.04558: Calling all_plugins_play to load vars for managed_node1 33932 1726882899.04561: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882899.04565: Calling groups_plugins_play to load vars for managed_node1 33932 1726882899.05862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882899.07579: done with get_vars() 33932 1726882899.07603: done getting variables 33932 1726882899.07644: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:41:39 -0400 (0:00:00.093) 0:00:19.544 ****** 33932 1726882899.07679: entering _queue_task() for managed_node1/set_fact 33932 1726882899.08005: worker is 1 (out of 1 available) 33932 1726882899.08018: exiting _queue_task() for managed_node1/set_fact 33932 1726882899.08030: done queuing things up, now waiting for results queue to drain 33932 1726882899.08032: waiting for pending results... 
33932 1726882899.08319: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag 33932 1726882899.08433: in run() - task 0e448fcc-3ce9-615b-5c48-000000000838 33932 1726882899.08454: variable 'ansible_search_path' from source: unknown 33932 1726882899.08462: variable 'ansible_search_path' from source: unknown 33932 1726882899.08512: calling self._execute() 33932 1726882899.08620: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882899.08630: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882899.08644: variable 'omit' from source: magic vars 33932 1726882899.09037: variable 'ansible_distribution_major_version' from source: facts 33932 1726882899.09058: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882899.09074: variable 'omit' from source: magic vars 33932 1726882899.09128: variable 'omit' from source: magic vars 33932 1726882899.09173: variable 'omit' from source: magic vars 33932 1726882899.09218: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882899.09266: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882899.09295: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882899.09318: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882899.09336: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882899.09373: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882899.09383: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882899.09390: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 33932 1726882899.09503: Set connection var ansible_shell_executable to /bin/sh 33932 1726882899.09521: Set connection var ansible_timeout to 10 33932 1726882899.09544: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882899.09557: Set connection var ansible_pipelining to False 33932 1726882899.09572: Set connection var ansible_connection to ssh 33932 1726882899.09580: Set connection var ansible_shell_type to sh 33932 1726882899.09611: variable 'ansible_shell_executable' from source: unknown 33932 1726882899.09619: variable 'ansible_connection' from source: unknown 33932 1726882899.09626: variable 'ansible_module_compression' from source: unknown 33932 1726882899.09631: variable 'ansible_shell_type' from source: unknown 33932 1726882899.09638: variable 'ansible_shell_executable' from source: unknown 33932 1726882899.09643: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882899.09650: variable 'ansible_pipelining' from source: unknown 33932 1726882899.09655: variable 'ansible_timeout' from source: unknown 33932 1726882899.09662: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882899.09812: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882899.09830: variable 'omit' from source: magic vars 33932 1726882899.09840: starting attempt loop 33932 1726882899.09847: running the handler 33932 1726882899.09865: handler run complete 33932 1726882899.09884: attempt loop complete, returning result 33932 1726882899.09892: _execute() done 33932 1726882899.09898: dumping result to json 33932 1726882899.09905: done dumping result, returning 33932 1726882899.09917: done running TaskExecutor() for 
managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [0e448fcc-3ce9-615b-5c48-000000000838] 33932 1726882899.09927: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000838 ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 33932 1726882899.10067: no more pending results, returning what we have 33932 1726882899.10073: results queue empty 33932 1726882899.10074: checking for any_errors_fatal 33932 1726882899.10076: done checking for any_errors_fatal 33932 1726882899.10076: checking for max_fail_percentage 33932 1726882899.10078: done checking for max_fail_percentage 33932 1726882899.10079: checking to see if all hosts have failed and the running result is not ok 33932 1726882899.10080: done checking to see if all hosts have failed 33932 1726882899.10081: getting the remaining hosts for this loop 33932 1726882899.10083: done getting the remaining hosts for this loop 33932 1726882899.10087: getting the next task for host managed_node1 33932 1726882899.10094: done getting next task for host managed_node1 33932 1726882899.10096: ^ task is: TASK: Stat profile file 33932 1726882899.10101: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33932 1726882899.10105: getting variables 33932 1726882899.10107: in VariableManager get_vars() 33932 1726882899.10146: Calling all_inventory to load vars for managed_node1 33932 1726882899.10149: Calling groups_inventory to load vars for managed_node1 33932 1726882899.10151: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882899.10166: Calling all_plugins_play to load vars for managed_node1 33932 1726882899.10171: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882899.10174: Calling groups_plugins_play to load vars for managed_node1 33932 1726882899.11401: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000838 33932 1726882899.11405: WORKER PROCESS EXITING 33932 1726882899.12738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882899.14633: done with get_vars() 33932 1726882899.14653: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:41:39 -0400 (0:00:00.070) 0:00:19.615 ****** 33932 1726882899.14741: entering _queue_task() for managed_node1/stat 33932 1726882899.15726: worker is 1 (out of 1 available) 33932 1726882899.15740: exiting _queue_task() for managed_node1/stat 33932 1726882899.15753: done queuing things up, now waiting for results queue to drain 33932 1726882899.15755: waiting for pending results... 
33932 1726882899.16546: running TaskExecutor() for managed_node1/TASK: Stat profile file 33932 1726882899.16720: in run() - task 0e448fcc-3ce9-615b-5c48-000000000839 33932 1726882899.16791: variable 'ansible_search_path' from source: unknown 33932 1726882899.16799: variable 'ansible_search_path' from source: unknown 33932 1726882899.16838: calling self._execute() 33932 1726882899.17058: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882899.17182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882899.17199: variable 'omit' from source: magic vars 33932 1726882899.18053: variable 'ansible_distribution_major_version' from source: facts 33932 1726882899.18077: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882899.18090: variable 'omit' from source: magic vars 33932 1726882899.18146: variable 'omit' from source: magic vars 33932 1726882899.18257: variable 'profile' from source: include params 33932 1726882899.18490: variable 'item' from source: include params 33932 1726882899.18559: variable 'item' from source: include params 33932 1726882899.18793: variable 'omit' from source: magic vars 33932 1726882899.18846: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882899.19062: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882899.19156: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882899.19184: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882899.19202: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882899.19285: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 
1726882899.19378: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882899.19478: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882899.19797: Set connection var ansible_shell_executable to /bin/sh 33932 1726882899.19810: Set connection var ansible_timeout to 10 33932 1726882899.19825: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882899.19834: Set connection var ansible_pipelining to False 33932 1726882899.19840: Set connection var ansible_connection to ssh 33932 1726882899.19846: Set connection var ansible_shell_type to sh 33932 1726882899.19908: variable 'ansible_shell_executable' from source: unknown 33932 1726882899.19917: variable 'ansible_connection' from source: unknown 33932 1726882899.19925: variable 'ansible_module_compression' from source: unknown 33932 1726882899.19933: variable 'ansible_shell_type' from source: unknown 33932 1726882899.20012: variable 'ansible_shell_executable' from source: unknown 33932 1726882899.20020: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882899.20027: variable 'ansible_pipelining' from source: unknown 33932 1726882899.20034: variable 'ansible_timeout' from source: unknown 33932 1726882899.20041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882899.20411: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 33932 1726882899.20474: variable 'omit' from source: magic vars 33932 1726882899.20485: starting attempt loop 33932 1726882899.20492: running the handler 33932 1726882899.20509: _low_level_execute_command(): starting 33932 1726882899.20558: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 33932 1726882899.23660: stderr chunk 
(state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882899.23689: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882899.23693: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882899.23989: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882899.24095: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882899.24197: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882899.25977: stdout chunk (state=3): >>>/root <<< 33932 1726882899.25981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882899.26374: stderr chunk (state=3): >>><<< 33932 1726882899.26377: stdout chunk (state=3): >>><<< 33932 1726882899.26384: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882899.26387: _low_level_execute_command(): starting 33932 1726882899.26389: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882899.260552-34837-187180642745314 `" && echo ansible-tmp-1726882899.260552-34837-187180642745314="` echo /root/.ansible/tmp/ansible-tmp-1726882899.260552-34837-187180642745314 `" ) && sleep 0' 33932 1726882899.27918: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882899.27922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882899.27949: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 33932 1726882899.27952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882899.27961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882899.28018: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882899.28193: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882899.28198: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882899.28305: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882899.30188: stdout chunk (state=3): >>>ansible-tmp-1726882899.260552-34837-187180642745314=/root/.ansible/tmp/ansible-tmp-1726882899.260552-34837-187180642745314 <<< 33932 1726882899.30305: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882899.30368: stderr chunk (state=3): >>><<< 33932 1726882899.30372: stdout chunk (state=3): >>><<< 33932 1726882899.30675: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882899.260552-34837-187180642745314=/root/.ansible/tmp/ansible-tmp-1726882899.260552-34837-187180642745314 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882899.30679: variable 'ansible_module_compression' from source: unknown 33932 1726882899.30681: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-33932njcnmxb6/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 33932 1726882899.30684: variable 'ansible_facts' from source: unknown 33932 1726882899.30686: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882899.260552-34837-187180642745314/AnsiballZ_stat.py 33932 1726882899.30795: Sending initial data 33932 1726882899.30798: Sent initial data (152 bytes) 33932 1726882899.31787: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882899.31802: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882899.31817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882899.31832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882899.31881: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882899.31901: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882899.31916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 
1726882899.31935: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882899.31947: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882899.31958: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882899.31975: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882899.31988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882899.32012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882899.32023: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882899.32032: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882899.32044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882899.32165: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882899.32197: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882899.32221: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882899.32350: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882899.34101: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 <<< 33932 1726882899.34185: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 33932 1726882899.34295: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-33932njcnmxb6/tmpesmwmqzu /root/.ansible/tmp/ansible-tmp-1726882899.260552-34837-187180642745314/AnsiballZ_stat.py <<< 33932 1726882899.34379: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 33932 1726882899.35790: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882899.35972: stderr chunk (state=3): >>><<< 33932 1726882899.35976: stdout chunk (state=3): >>><<< 33932 1726882899.35978: done transferring module to remote 33932 1726882899.35980: _low_level_execute_command(): starting 33932 1726882899.35983: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882899.260552-34837-187180642745314/ /root/.ansible/tmp/ansible-tmp-1726882899.260552-34837-187180642745314/AnsiballZ_stat.py && sleep 0' 33932 1726882899.36594: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882899.36607: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882899.36627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882899.36660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882899.36705: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882899.36718: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882899.36748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882899.37482: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass <<< 33932 1726882899.37497: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882899.37511: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882899.37523: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882899.37539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882899.37553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882899.37643: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882899.37659: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882899.37676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882899.37762: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882899.37790: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882899.37815: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882899.37934: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882899.39748: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882899.39752: stdout chunk (state=3): >>><<< 33932 1726882899.39754: stderr chunk (state=3): >>><<< 33932 1726882899.39834: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882899.39838: _low_level_execute_command(): starting 33932 1726882899.39841: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882899.260552-34837-187180642745314/AnsiballZ_stat.py && sleep 0' 33932 1726882899.40337: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882899.40350: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882899.40363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882899.40383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882899.40422: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882899.40433: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882899.40445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882899.40460: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882899.40475: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882899.40487: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882899.40498: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882899.40511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882899.40527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882899.40539: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882899.40550: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882899.40562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882899.40638: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882899.40654: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882899.40674: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882899.40805: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882899.53941: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr101.90", "follow": false, "checksum_algorithm": "sha1"}}} <<< 33932 1726882899.54962: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 33932 1726882899.55016: stderr chunk (state=3): >>><<< 33932 1726882899.55020: stdout chunk (state=3): >>><<< 33932 1726882899.55036: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr101.90", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
33932 1726882899.55059: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-lsr101.90', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882899.260552-34837-187180642745314/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 33932 1726882899.55071: _low_level_execute_command(): starting 33932 1726882899.55075: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882899.260552-34837-187180642745314/ > /dev/null 2>&1 && sleep 0' 33932 1726882899.55507: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882899.55515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882899.55560: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882899.55566: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882899.55571: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882899.55620: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882899.55635: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882899.55741: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882899.57581: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882899.57617: stderr chunk (state=3): >>><<< 33932 1726882899.57620: stdout chunk (state=3): >>><<< 33932 1726882899.57637: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 33932 1726882899.57643: handler run complete 33932 1726882899.57667: attempt loop complete, returning result 33932 1726882899.57677: _execute() done 33932 1726882899.57680: dumping result to json 33932 1726882899.57682: done dumping result, returning 33932 1726882899.57691: done running TaskExecutor() for managed_node1/TASK: Stat profile file [0e448fcc-3ce9-615b-5c48-000000000839] 33932 1726882899.57696: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000839 33932 1726882899.57796: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000839 33932 1726882899.57799: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 33932 1726882899.57885: no more pending results, returning what we have 33932 1726882899.57888: results queue empty 33932 1726882899.57889: checking for any_errors_fatal 33932 1726882899.57897: done checking for any_errors_fatal 33932 1726882899.57897: checking for max_fail_percentage 33932 1726882899.57899: done checking for max_fail_percentage 33932 1726882899.57900: checking to see if all hosts have failed and the running result is not ok 33932 1726882899.57901: done checking to see if all hosts have failed 33932 1726882899.57901: getting the remaining hosts for this loop 33932 1726882899.57903: done getting the remaining hosts for this loop 33932 1726882899.57907: getting the next task for host managed_node1 33932 1726882899.57914: done getting next task for host managed_node1 33932 1726882899.57917: ^ task is: TASK: Set NM profile exist flag based on the profile files 33932 1726882899.57920: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33932 1726882899.57925: getting variables 33932 1726882899.57926: in VariableManager get_vars() 33932 1726882899.57967: Calling all_inventory to load vars for managed_node1 33932 1726882899.57969: Calling groups_inventory to load vars for managed_node1 33932 1726882899.57971: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882899.57982: Calling all_plugins_play to load vars for managed_node1 33932 1726882899.57984: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882899.57987: Calling groups_plugins_play to load vars for managed_node1 33932 1726882899.59060: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882899.60120: done with get_vars() 33932 1726882899.60140: done getting variables 33932 1726882899.60185: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:41:39 -0400 (0:00:00.454) 0:00:20.069 ****** 33932 1726882899.60241: entering _queue_task() for managed_node1/set_fact 33932 1726882899.60601: worker is 1 (out of 1 available) 33932 1726882899.60613: exiting _queue_task() for managed_node1/set_fact 33932 1726882899.60625: done queuing things up, now waiting for results queue to drain 33932 1726882899.60627: waiting for pending results... 33932 1726882899.60945: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 33932 1726882899.61159: in run() - task 0e448fcc-3ce9-615b-5c48-00000000083a 33932 1726882899.61177: variable 'ansible_search_path' from source: unknown 33932 1726882899.61181: variable 'ansible_search_path' from source: unknown 33932 1726882899.61255: calling self._execute() 33932 1726882899.61355: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882899.61359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882899.61362: variable 'omit' from source: magic vars 33932 1726882899.61665: variable 'ansible_distribution_major_version' from source: facts 33932 1726882899.61681: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882899.61770: variable 'profile_stat' from source: set_fact 33932 1726882899.61781: Evaluated conditional (profile_stat.stat.exists): False 33932 1726882899.61785: when evaluation is False, skipping this task 33932 1726882899.61788: _execute() done 33932 1726882899.61791: dumping result to json 33932 1726882899.61796: done dumping result, returning 33932 1726882899.61798: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [0e448fcc-3ce9-615b-5c48-00000000083a] 33932 1726882899.61806: sending task result for task 
0e448fcc-3ce9-615b-5c48-00000000083a 33932 1726882899.61890: done sending task result for task 0e448fcc-3ce9-615b-5c48-00000000083a 33932 1726882899.61893: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 33932 1726882899.61936: no more pending results, returning what we have 33932 1726882899.61940: results queue empty 33932 1726882899.61941: checking for any_errors_fatal 33932 1726882899.61949: done checking for any_errors_fatal 33932 1726882899.61949: checking for max_fail_percentage 33932 1726882899.61951: done checking for max_fail_percentage 33932 1726882899.61952: checking to see if all hosts have failed and the running result is not ok 33932 1726882899.61953: done checking to see if all hosts have failed 33932 1726882899.61953: getting the remaining hosts for this loop 33932 1726882899.61955: done getting the remaining hosts for this loop 33932 1726882899.61958: getting the next task for host managed_node1 33932 1726882899.61966: done getting next task for host managed_node1 33932 1726882899.61971: ^ task is: TASK: Get NM profile info 33932 1726882899.61975: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 33932 1726882899.61979: getting variables 33932 1726882899.61980: in VariableManager get_vars() 33932 1726882899.62014: Calling all_inventory to load vars for managed_node1 33932 1726882899.62016: Calling groups_inventory to load vars for managed_node1 33932 1726882899.62018: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882899.62027: Calling all_plugins_play to load vars for managed_node1 33932 1726882899.62030: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882899.62032: Calling groups_plugins_play to load vars for managed_node1 33932 1726882899.62820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882899.63771: done with get_vars() 33932 1726882899.63787: done getting variables 33932 1726882899.63831: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:41:39 -0400 (0:00:00.036) 0:00:20.106 ****** 33932 1726882899.63853: entering _queue_task() for managed_node1/shell 33932 1726882899.64056: worker is 1 (out of 1 available) 33932 1726882899.64074: exiting _queue_task() for managed_node1/shell 33932 1726882899.64086: done queuing things up, now waiting for results queue to drain 33932 1726882899.64088: waiting for pending results... 
33932 1726882899.64254: running TaskExecutor() for managed_node1/TASK: Get NM profile info 33932 1726882899.64329: in run() - task 0e448fcc-3ce9-615b-5c48-00000000083b 33932 1726882899.64340: variable 'ansible_search_path' from source: unknown 33932 1726882899.64346: variable 'ansible_search_path' from source: unknown 33932 1726882899.64379: calling self._execute() 33932 1726882899.64450: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882899.64453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882899.64463: variable 'omit' from source: magic vars 33932 1726882899.64730: variable 'ansible_distribution_major_version' from source: facts 33932 1726882899.64742: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882899.64745: variable 'omit' from source: magic vars 33932 1726882899.64779: variable 'omit' from source: magic vars 33932 1726882899.64849: variable 'profile' from source: include params 33932 1726882899.64855: variable 'item' from source: include params 33932 1726882899.64905: variable 'item' from source: include params 33932 1726882899.64920: variable 'omit' from source: magic vars 33932 1726882899.64965: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882899.65020: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882899.65038: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882899.65051: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882899.65060: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882899.65088: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 
1726882899.65091: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882899.65095: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882899.65167: Set connection var ansible_shell_executable to /bin/sh 33932 1726882899.65173: Set connection var ansible_timeout to 10 33932 1726882899.65179: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882899.65184: Set connection var ansible_pipelining to False 33932 1726882899.65188: Set connection var ansible_connection to ssh 33932 1726882899.65190: Set connection var ansible_shell_type to sh 33932 1726882899.65208: variable 'ansible_shell_executable' from source: unknown 33932 1726882899.65211: variable 'ansible_connection' from source: unknown 33932 1726882899.65213: variable 'ansible_module_compression' from source: unknown 33932 1726882899.65215: variable 'ansible_shell_type' from source: unknown 33932 1726882899.65219: variable 'ansible_shell_executable' from source: unknown 33932 1726882899.65222: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882899.65224: variable 'ansible_pipelining' from source: unknown 33932 1726882899.65227: variable 'ansible_timeout' from source: unknown 33932 1726882899.65229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882899.65326: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882899.65335: variable 'omit' from source: magic vars 33932 1726882899.65341: starting attempt loop 33932 1726882899.65343: running the handler 33932 1726882899.65354: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882899.65371: _low_level_execute_command(): starting 33932 1726882899.65376: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 33932 1726882899.65912: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882899.65927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882899.65938: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 33932 1726882899.65952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882899.65968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882899.66014: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882899.66027: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882899.66135: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882899.67779: stdout chunk (state=3): >>>/root <<< 33932 1726882899.67886: stderr chunk (state=3): >>>debug2: Received 
exit status from master 0 <<< 33932 1726882899.67930: stderr chunk (state=3): >>><<< 33932 1726882899.67933: stdout chunk (state=3): >>><<< 33932 1726882899.67954: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882899.67968: _low_level_execute_command(): starting 33932 1726882899.67978: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882899.6795382-34874-136490132449068 `" && echo ansible-tmp-1726882899.6795382-34874-136490132449068="` echo /root/.ansible/tmp/ansible-tmp-1726882899.6795382-34874-136490132449068 `" ) && sleep 0' 33932 1726882899.68429: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882899.68440: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882899.68469: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882899.68478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882899.68525: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882899.68528: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882899.68632: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882899.70512: stdout chunk (state=3): >>>ansible-tmp-1726882899.6795382-34874-136490132449068=/root/.ansible/tmp/ansible-tmp-1726882899.6795382-34874-136490132449068 <<< 33932 1726882899.70625: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882899.70681: stderr chunk (state=3): >>><<< 33932 1726882899.70684: stdout chunk (state=3): >>><<< 33932 1726882899.70700: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882899.6795382-34874-136490132449068=/root/.ansible/tmp/ansible-tmp-1726882899.6795382-34874-136490132449068 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882899.70731: variable 'ansible_module_compression' from source: unknown 33932 1726882899.70780: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-33932njcnmxb6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 33932 1726882899.70806: variable 'ansible_facts' from source: unknown 33932 1726882899.70876: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882899.6795382-34874-136490132449068/AnsiballZ_command.py 33932 1726882899.70990: Sending initial data 33932 1726882899.70993: Sent initial data (156 bytes) 33932 1726882899.71826: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882899.71835: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882899.71845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882899.71859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 
33932 1726882899.71904: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882899.71913: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882899.71925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882899.71938: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882899.71948: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882899.71953: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882899.71961: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882899.71973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882899.71984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882899.71992: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882899.71999: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882899.72015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882899.72093: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882899.72106: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882899.72114: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882899.72235: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882899.73963: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension 
"fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 33932 1726882899.73971: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 33932 1726882899.74053: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 33932 1726882899.74143: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-33932njcnmxb6/tmpge9lcysb /root/.ansible/tmp/ansible-tmp-1726882899.6795382-34874-136490132449068/AnsiballZ_command.py <<< 33932 1726882899.74229: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 33932 1726882899.75912: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882899.76090: stderr chunk (state=3): >>><<< 33932 1726882899.76093: stdout chunk (state=3): >>><<< 33932 1726882899.76113: done transferring module to remote 33932 1726882899.76125: _low_level_execute_command(): starting 33932 1726882899.76129: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882899.6795382-34874-136490132449068/ /root/.ansible/tmp/ansible-tmp-1726882899.6795382-34874-136490132449068/AnsiballZ_command.py && sleep 0' 33932 1726882899.76921: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882899.76927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882899.76970: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882899.76978: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882899.76995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 33932 1726882899.77003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882899.77073: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882899.77095: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882899.77214: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882899.79014: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882899.79018: stdout chunk (state=3): >>><<< 33932 1726882899.79023: stderr chunk (state=3): >>><<< 33932 1726882899.79036: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882899.79039: _low_level_execute_command(): starting 33932 1726882899.79044: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882899.6795382-34874-136490132449068/AnsiballZ_command.py && sleep 0' 33932 1726882899.81009: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882899.81018: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882899.81027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882899.81041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882899.81085: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882899.81091: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882899.81101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882899.81114: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882899.81121: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882899.81129: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882899.81135: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 33932 1726882899.81144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882899.81155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882899.81162: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882899.81173: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882899.81180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882899.81265: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882899.81276: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882899.81291: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882899.81400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882899.96884: stdout chunk (state=3): >>> {"changed": true, "stdout": "lsr101.90 /etc/NetworkManager/system-connections/lsr101.90.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr101.90 | grep /etc", "start": "2024-09-20 21:41:39.946084", "end": "2024-09-20 21:41:39.967330", "delta": "0:00:00.021246", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr101.90 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 33932 1726882899.98194: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 33932 1726882899.98198: stdout chunk (state=3): >>><<< 33932 1726882899.98203: stderr chunk (state=3): >>><<< 33932 1726882899.98221: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "lsr101.90 /etc/NetworkManager/system-connections/lsr101.90.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr101.90 | grep /etc", "start": "2024-09-20 21:41:39.946084", "end": "2024-09-20 21:41:39.967330", "delta": "0:00:00.021246", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr101.90 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 
closed. 33932 1726882899.98261: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep lsr101.90 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882899.6795382-34874-136490132449068/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 33932 1726882899.98273: _low_level_execute_command(): starting 33932 1726882899.98280: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882899.6795382-34874-136490132449068/ > /dev/null 2>&1 && sleep 0' 33932 1726882899.98974: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882899.98978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882899.99016: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 33932 1726882899.99021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882899.99023: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882899.99025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882899.99084: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882899.99093: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882899.99103: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882899.99213: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882900.01026: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882900.01099: stderr chunk (state=3): >>><<< 33932 1726882900.01109: stdout chunk (state=3): >>><<< 33932 1726882900.01375: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882900.01378: handler run complete 33932 1726882900.01381: Evaluated conditional (False): False 33932 1726882900.01383: attempt loop complete, returning result 33932 1726882900.01385: _execute() done 33932 1726882900.01387: dumping result to json 33932 1726882900.01389: done dumping result, returning 33932 1726882900.01390: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [0e448fcc-3ce9-615b-5c48-00000000083b] 33932 1726882900.01392: sending task result for task 0e448fcc-3ce9-615b-5c48-00000000083b 33932 1726882900.01462: done sending task result for task 0e448fcc-3ce9-615b-5c48-00000000083b 33932 1726882900.01467: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr101.90 | grep /etc", "delta": "0:00:00.021246", "end": "2024-09-20 21:41:39.967330", "rc": 0, "start": "2024-09-20 21:41:39.946084" } STDOUT: lsr101.90 /etc/NetworkManager/system-connections/lsr101.90.nmconnection 33932 1726882900.01549: no more pending results, returning what we have 33932 1726882900.01552: results queue empty 33932 1726882900.01553: checking for any_errors_fatal 33932 1726882900.01561: done checking for any_errors_fatal 33932 1726882900.01562: checking for max_fail_percentage 33932 1726882900.01566: done checking for max_fail_percentage 33932 1726882900.01567: checking to see if all hosts have failed and the running result is not ok 33932 1726882900.01570: done checking to see if all hosts have failed 33932 1726882900.01571: getting the remaining hosts for this loop 33932 1726882900.01573: done getting the remaining hosts for this loop 33932 1726882900.01576: getting the next task for host managed_node1 33932 1726882900.01583: done getting next task for host managed_node1 33932 1726882900.01586: 
^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 33932 1726882900.01590: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33932 1726882900.01595: getting variables 33932 1726882900.01596: in VariableManager get_vars() 33932 1726882900.01639: Calling all_inventory to load vars for managed_node1 33932 1726882900.01642: Calling groups_inventory to load vars for managed_node1 33932 1726882900.01645: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882900.01657: Calling all_plugins_play to load vars for managed_node1 33932 1726882900.01660: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882900.01665: Calling groups_plugins_play to load vars for managed_node1 33932 1726882900.03652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882900.05688: done with get_vars() 33932 1726882900.05709: done getting variables 33932 1726882900.05767: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:41:40 -0400 (0:00:00.419) 0:00:20.525 ****** 33932 1726882900.05804: entering _queue_task() for managed_node1/set_fact 33932 1726882900.06161: worker is 1 (out of 1 available) 33932 1726882900.06176: exiting _queue_task() for managed_node1/set_fact 33932 1726882900.06186: done queuing things up, now waiting for results queue to drain 33932 1726882900.06188: waiting for pending results... 33932 1726882900.06524: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 33932 1726882900.06647: in run() - task 0e448fcc-3ce9-615b-5c48-00000000083c 33932 1726882900.06675: variable 'ansible_search_path' from source: unknown 33932 1726882900.06685: variable 'ansible_search_path' from source: unknown 33932 1726882900.06722: calling self._execute() 33932 1726882900.06828: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882900.06839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882900.06854: variable 'omit' from source: magic vars 33932 1726882900.07252: variable 'ansible_distribution_major_version' from source: facts 33932 1726882900.07275: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882900.07456: variable 'nm_profile_exists' from source: set_fact 33932 1726882900.07481: Evaluated conditional (nm_profile_exists.rc == 0): True 33932 1726882900.07491: variable 'omit' from source: magic vars 33932 1726882900.07548: variable 'omit' from source: magic vars 33932 1726882900.07587: 
variable 'omit' from source: magic vars 33932 1726882900.07637: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882900.07682: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882900.07705: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882900.07727: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882900.07745: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882900.07785: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882900.07794: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882900.07801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882900.07913: Set connection var ansible_shell_executable to /bin/sh 33932 1726882900.07926: Set connection var ansible_timeout to 10 33932 1726882900.07934: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882900.07943: Set connection var ansible_pipelining to False 33932 1726882900.07953: Set connection var ansible_connection to ssh 33932 1726882900.07959: Set connection var ansible_shell_type to sh 33932 1726882900.07993: variable 'ansible_shell_executable' from source: unknown 33932 1726882900.08001: variable 'ansible_connection' from source: unknown 33932 1726882900.08007: variable 'ansible_module_compression' from source: unknown 33932 1726882900.08012: variable 'ansible_shell_type' from source: unknown 33932 1726882900.08018: variable 'ansible_shell_executable' from source: unknown 33932 1726882900.08024: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882900.08031: variable 'ansible_pipelining' from 
source: unknown 33932 1726882900.08036: variable 'ansible_timeout' from source: unknown 33932 1726882900.08044: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882900.08202: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882900.08220: variable 'omit' from source: magic vars 33932 1726882900.08230: starting attempt loop 33932 1726882900.08238: running the handler 33932 1726882900.08255: handler run complete 33932 1726882900.08277: attempt loop complete, returning result 33932 1726882900.08284: _execute() done 33932 1726882900.08291: dumping result to json 33932 1726882900.08301: done dumping result, returning 33932 1726882900.08312: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0e448fcc-3ce9-615b-5c48-00000000083c] 33932 1726882900.08321: sending task result for task 0e448fcc-3ce9-615b-5c48-00000000083c ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 33932 1726882900.08473: no more pending results, returning what we have 33932 1726882900.08477: results queue empty 33932 1726882900.08478: checking for any_errors_fatal 33932 1726882900.08487: done checking for any_errors_fatal 33932 1726882900.08488: checking for max_fail_percentage 33932 1726882900.08490: done checking for max_fail_percentage 33932 1726882900.08491: checking to see if all hosts have failed and the running result is not ok 33932 1726882900.08491: done checking to see if all hosts have failed 33932 1726882900.08492: getting the remaining hosts for this loop 33932 1726882900.08494: done 
getting the remaining hosts for this loop 33932 1726882900.08497: getting the next task for host managed_node1 33932 1726882900.08507: done getting next task for host managed_node1 33932 1726882900.08511: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 33932 1726882900.08515: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882900.08520: getting variables 33932 1726882900.08521: in VariableManager get_vars() 33932 1726882900.08565: Calling all_inventory to load vars for managed_node1 33932 1726882900.08571: Calling groups_inventory to load vars for managed_node1 33932 1726882900.08574: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882900.08586: Calling all_plugins_play to load vars for managed_node1 33932 1726882900.08589: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882900.08592: Calling groups_plugins_play to load vars for managed_node1 33932 1726882900.09603: done sending task result for task 0e448fcc-3ce9-615b-5c48-00000000083c 33932 1726882900.09607: WORKER PROCESS EXITING 33932 1726882900.10345: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882900.12210: done with get_vars() 33932 1726882900.12238: done getting variables 33932 1726882900.12303: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 33932 1726882900.12543: variable 'profile' from source: include params 33932 1726882900.12547: variable 'item' from source: include params 33932 1726882900.12656: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-lsr101.90] ********************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:41:40 -0400 (0:00:00.068) 0:00:20.594 ****** 33932 1726882900.12697: entering _queue_task() for managed_node1/command 33932 1726882900.12995: worker is 1 (out of 1 available) 33932 1726882900.13008: exiting _queue_task() for managed_node1/command 33932 
1726882900.13020: done queuing things up, now waiting for results queue to drain 33932 1726882900.13022: waiting for pending results... 33932 1726882900.13324: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-lsr101.90 33932 1726882900.13455: in run() - task 0e448fcc-3ce9-615b-5c48-00000000083e 33932 1726882900.13489: variable 'ansible_search_path' from source: unknown 33932 1726882900.13498: variable 'ansible_search_path' from source: unknown 33932 1726882900.13541: calling self._execute() 33932 1726882900.13663: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882900.13679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882900.13700: variable 'omit' from source: magic vars 33932 1726882900.14201: variable 'ansible_distribution_major_version' from source: facts 33932 1726882900.14361: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882900.14610: variable 'profile_stat' from source: set_fact 33932 1726882900.14628: Evaluated conditional (profile_stat.stat.exists): False 33932 1726882900.14636: when evaluation is False, skipping this task 33932 1726882900.14643: _execute() done 33932 1726882900.14651: dumping result to json 33932 1726882900.14660: done dumping result, returning 33932 1726882900.14681: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-lsr101.90 [0e448fcc-3ce9-615b-5c48-00000000083e] 33932 1726882900.14693: sending task result for task 0e448fcc-3ce9-615b-5c48-00000000083e skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 33932 1726882900.14847: no more pending results, returning what we have 33932 1726882900.14852: results queue empty 33932 1726882900.14853: checking for any_errors_fatal 33932 1726882900.14860: done checking for any_errors_fatal 33932 1726882900.14861: checking 
for max_fail_percentage 33932 1726882900.14865: done checking for max_fail_percentage 33932 1726882900.14867: checking to see if all hosts have failed and the running result is not ok 33932 1726882900.14870: done checking to see if all hosts have failed 33932 1726882900.14871: getting the remaining hosts for this loop 33932 1726882900.14873: done getting the remaining hosts for this loop 33932 1726882900.14877: getting the next task for host managed_node1 33932 1726882900.14885: done getting next task for host managed_node1 33932 1726882900.14888: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 33932 1726882900.14892: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882900.14896: getting variables 33932 1726882900.14898: in VariableManager get_vars() 33932 1726882900.14943: Calling all_inventory to load vars for managed_node1 33932 1726882900.14946: Calling groups_inventory to load vars for managed_node1 33932 1726882900.14949: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882900.14963: Calling all_plugins_play to load vars for managed_node1 33932 1726882900.14970: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882900.14974: Calling groups_plugins_play to load vars for managed_node1 33932 1726882900.16261: done sending task result for task 0e448fcc-3ce9-615b-5c48-00000000083e 33932 1726882900.16266: WORKER PROCESS EXITING 33932 1726882900.17381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882900.20450: done with get_vars() 33932 1726882900.20488: done getting variables 33932 1726882900.20550: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 33932 1726882900.20670: variable 'profile' from source: include params 33932 1726882900.20678: variable 'item' from source: include params 33932 1726882900.20741: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-lsr101.90] ******************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:41:40 -0400 (0:00:00.080) 0:00:20.675 ****** 33932 1726882900.20776: entering _queue_task() for managed_node1/set_fact 33932 1726882900.21118: worker is 1 (out of 1 available) 33932 1726882900.21132: exiting _queue_task() for managed_node1/set_fact 33932 
1726882900.21143: done queuing things up, now waiting for results queue to drain 33932 1726882900.21144: waiting for pending results... 33932 1726882900.21463: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-lsr101.90 33932 1726882900.21592: in run() - task 0e448fcc-3ce9-615b-5c48-00000000083f 33932 1726882900.21613: variable 'ansible_search_path' from source: unknown 33932 1726882900.21620: variable 'ansible_search_path' from source: unknown 33932 1726882900.21666: calling self._execute() 33932 1726882900.21781: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882900.21792: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882900.21808: variable 'omit' from source: magic vars 33932 1726882900.22174: variable 'ansible_distribution_major_version' from source: facts 33932 1726882900.22191: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882900.22331: variable 'profile_stat' from source: set_fact 33932 1726882900.22354: Evaluated conditional (profile_stat.stat.exists): False 33932 1726882900.22361: when evaluation is False, skipping this task 33932 1726882900.22372: _execute() done 33932 1726882900.22379: dumping result to json 33932 1726882900.22386: done dumping result, returning 33932 1726882900.22395: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-lsr101.90 [0e448fcc-3ce9-615b-5c48-00000000083f] 33932 1726882900.22416: sending task result for task 0e448fcc-3ce9-615b-5c48-00000000083f 33932 1726882900.22530: done sending task result for task 0e448fcc-3ce9-615b-5c48-00000000083f 33932 1726882900.22541: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 33932 1726882900.22607: no more pending results, returning what we have 33932 1726882900.22612: results queue 
empty 33932 1726882900.22613: checking for any_errors_fatal 33932 1726882900.22622: done checking for any_errors_fatal 33932 1726882900.22623: checking for max_fail_percentage 33932 1726882900.22625: done checking for max_fail_percentage 33932 1726882900.22626: checking to see if all hosts have failed and the running result is not ok 33932 1726882900.22627: done checking to see if all hosts have failed 33932 1726882900.22628: getting the remaining hosts for this loop 33932 1726882900.22629: done getting the remaining hosts for this loop 33932 1726882900.22633: getting the next task for host managed_node1 33932 1726882900.22642: done getting next task for host managed_node1 33932 1726882900.22645: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 33932 1726882900.22650: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882900.22655: getting variables 33932 1726882900.22657: in VariableManager get_vars() 33932 1726882900.22707: Calling all_inventory to load vars for managed_node1 33932 1726882900.22711: Calling groups_inventory to load vars for managed_node1 33932 1726882900.22714: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882900.22728: Calling all_plugins_play to load vars for managed_node1 33932 1726882900.22731: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882900.22734: Calling groups_plugins_play to load vars for managed_node1 33932 1726882900.24510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882900.28755: done with get_vars() 33932 1726882900.28790: done getting variables 33932 1726882900.28843: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 33932 1726882900.28951: variable 'profile' from source: include params 33932 1726882900.28955: variable 'item' from source: include params 33932 1726882900.29017: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-lsr101.90] ************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:41:40 -0400 (0:00:00.082) 0:00:20.758 ****** 33932 1726882900.29049: entering _queue_task() for managed_node1/command 33932 1726882900.29866: worker is 1 (out of 1 available) 33932 1726882900.29882: exiting _queue_task() for managed_node1/command 33932 1726882900.29893: done queuing things up, now waiting for results queue to drain 33932 1726882900.29895: waiting for pending results... 
33932 1726882900.30533: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-lsr101.90 33932 1726882900.31307: in run() - task 0e448fcc-3ce9-615b-5c48-000000000840 33932 1726882900.31326: variable 'ansible_search_path' from source: unknown 33932 1726882900.31410: variable 'ansible_search_path' from source: unknown 33932 1726882900.31451: calling self._execute() 33932 1726882900.31666: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882900.31679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882900.31693: variable 'omit' from source: magic vars 33932 1726882900.32471: variable 'ansible_distribution_major_version' from source: facts 33932 1726882900.32612: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882900.32857: variable 'profile_stat' from source: set_fact 33932 1726882900.32881: Evaluated conditional (profile_stat.stat.exists): False 33932 1726882900.32927: when evaluation is False, skipping this task 33932 1726882900.32935: _execute() done 33932 1726882900.32944: dumping result to json 33932 1726882900.32952: done dumping result, returning 33932 1726882900.32961: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-lsr101.90 [0e448fcc-3ce9-615b-5c48-000000000840] 33932 1726882900.32980: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000840 skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 33932 1726882900.33189: no more pending results, returning what we have 33932 1726882900.33193: results queue empty 33932 1726882900.33195: checking for any_errors_fatal 33932 1726882900.33201: done checking for any_errors_fatal 33932 1726882900.33202: checking for max_fail_percentage 33932 1726882900.33204: done checking for max_fail_percentage 33932 1726882900.33205: checking to see if all hosts have 
failed and the running result is not ok 33932 1726882900.33206: done checking to see if all hosts have failed 33932 1726882900.33207: getting the remaining hosts for this loop 33932 1726882900.33209: done getting the remaining hosts for this loop 33932 1726882900.33212: getting the next task for host managed_node1 33932 1726882900.33219: done getting next task for host managed_node1 33932 1726882900.33223: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 33932 1726882900.33228: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882900.33232: getting variables 33932 1726882900.33233: in VariableManager get_vars() 33932 1726882900.33281: Calling all_inventory to load vars for managed_node1 33932 1726882900.33284: Calling groups_inventory to load vars for managed_node1 33932 1726882900.33287: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882900.33302: Calling all_plugins_play to load vars for managed_node1 33932 1726882900.33305: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882900.33308: Calling groups_plugins_play to load vars for managed_node1 33932 1726882900.34581: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000840 33932 1726882900.34584: WORKER PROCESS EXITING 33932 1726882900.35942: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882900.38692: done with get_vars() 33932 1726882900.38719: done getting variables 33932 1726882900.38783: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 33932 1726882900.38905: variable 'profile' from source: include params 33932 1726882900.38909: variable 'item' from source: include params 33932 1726882900.38972: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-lsr101.90] *********************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:41:40 -0400 (0:00:00.099) 0:00:20.857 ****** 33932 1726882900.39003: entering _queue_task() for managed_node1/set_fact 33932 1726882900.39333: worker is 1 (out of 1 available) 33932 1726882900.39351: exiting _queue_task() for managed_node1/set_fact 33932 
1726882900.39363: done queuing things up, now waiting for results queue to drain 33932 1726882900.39371: waiting for pending results... 33932 1726882900.39654: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-lsr101.90 33932 1726882900.39787: in run() - task 0e448fcc-3ce9-615b-5c48-000000000841 33932 1726882900.39814: variable 'ansible_search_path' from source: unknown 33932 1726882900.39823: variable 'ansible_search_path' from source: unknown 33932 1726882900.39866: calling self._execute() 33932 1726882900.39986: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882900.40003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882900.40016: variable 'omit' from source: magic vars 33932 1726882900.40422: variable 'ansible_distribution_major_version' from source: facts 33932 1726882900.40444: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882900.40583: variable 'profile_stat' from source: set_fact 33932 1726882900.40601: Evaluated conditional (profile_stat.stat.exists): False 33932 1726882900.40608: when evaluation is False, skipping this task 33932 1726882900.40616: _execute() done 33932 1726882900.40624: dumping result to json 33932 1726882900.40632: done dumping result, returning 33932 1726882900.40645: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-lsr101.90 [0e448fcc-3ce9-615b-5c48-000000000841] 33932 1726882900.40658: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000841 33932 1726882900.40773: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000841 33932 1726882900.40783: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 33932 1726882900.40831: no more pending results, returning what we have 33932 1726882900.40836: results queue empty 33932 
1726882900.40837: checking for any_errors_fatal 33932 1726882900.40843: done checking for any_errors_fatal 33932 1726882900.40844: checking for max_fail_percentage 33932 1726882900.40847: done checking for max_fail_percentage 33932 1726882900.40848: checking to see if all hosts have failed and the running result is not ok 33932 1726882900.40849: done checking to see if all hosts have failed 33932 1726882900.40850: getting the remaining hosts for this loop 33932 1726882900.40852: done getting the remaining hosts for this loop 33932 1726882900.40855: getting the next task for host managed_node1 33932 1726882900.40870: done getting next task for host managed_node1 33932 1726882900.40874: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 33932 1726882900.40878: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882900.40884: getting variables 33932 1726882900.40886: in VariableManager get_vars() 33932 1726882900.40929: Calling all_inventory to load vars for managed_node1 33932 1726882900.40932: Calling groups_inventory to load vars for managed_node1 33932 1726882900.40934: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882900.40949: Calling all_plugins_play to load vars for managed_node1 33932 1726882900.40952: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882900.40956: Calling groups_plugins_play to load vars for managed_node1 33932 1726882900.44891: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882900.48301: done with get_vars() 33932 1726882900.48327: done getting variables 33932 1726882900.48897: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 33932 1726882900.49013: variable 'profile' from source: include params 33932 1726882900.49017: variable 'item' from source: include params 33932 1726882900.49080: variable 'item' from source: include params TASK [Assert that the profile is present - 'lsr101.90'] ************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:41:40 -0400 (0:00:00.101) 0:00:20.958 ****** 33932 1726882900.49109: entering _queue_task() for managed_node1/assert 33932 1726882900.49727: worker is 1 (out of 1 available) 33932 1726882900.49741: exiting _queue_task() for managed_node1/assert 33932 1726882900.49753: done queuing things up, now waiting for results queue to drain 33932 1726882900.49755: waiting for pending results... 
33932 1726882900.50865: running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'lsr101.90' 33932 1726882900.51476: in run() - task 0e448fcc-3ce9-615b-5c48-0000000006c0 33932 1726882900.51482: variable 'ansible_search_path' from source: unknown 33932 1726882900.51485: variable 'ansible_search_path' from source: unknown 33932 1726882900.51524: calling self._execute() 33932 1726882900.51755: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882900.51759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882900.51824: variable 'omit' from source: magic vars 33932 1726882900.52546: variable 'ansible_distribution_major_version' from source: facts 33932 1726882900.52560: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882900.52565: variable 'omit' from source: magic vars 33932 1726882900.52727: variable 'omit' from source: magic vars 33932 1726882900.52965: variable 'profile' from source: include params 33932 1726882900.52972: variable 'item' from source: include params 33932 1726882900.53145: variable 'item' from source: include params 33932 1726882900.53166: variable 'omit' from source: magic vars 33932 1726882900.53207: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882900.53353: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882900.53373: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882900.53391: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882900.53402: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882900.53432: variable 'inventory_hostname' from source: host vars 
for 'managed_node1' 33932 1726882900.53435: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882900.53438: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882900.53651: Set connection var ansible_shell_executable to /bin/sh 33932 1726882900.53773: Set connection var ansible_timeout to 10 33932 1726882900.53781: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882900.53787: Set connection var ansible_pipelining to False 33932 1726882900.53789: Set connection var ansible_connection to ssh 33932 1726882900.53792: Set connection var ansible_shell_type to sh 33932 1726882900.53817: variable 'ansible_shell_executable' from source: unknown 33932 1726882900.53820: variable 'ansible_connection' from source: unknown 33932 1726882900.53822: variable 'ansible_module_compression' from source: unknown 33932 1726882900.53825: variable 'ansible_shell_type' from source: unknown 33932 1726882900.53827: variable 'ansible_shell_executable' from source: unknown 33932 1726882900.53830: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882900.53832: variable 'ansible_pipelining' from source: unknown 33932 1726882900.53834: variable 'ansible_timeout' from source: unknown 33932 1726882900.53839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882900.54115: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882900.54125: variable 'omit' from source: magic vars 33932 1726882900.54130: starting attempt loop 33932 1726882900.54133: running the handler 33932 1726882900.54355: variable 'lsr_net_profile_exists' from source: set_fact 33932 1726882900.54359: Evaluated conditional 
(lsr_net_profile_exists): True 33932 1726882900.54370: handler run complete 33932 1726882900.54382: attempt loop complete, returning result 33932 1726882900.54385: _execute() done 33932 1726882900.54388: dumping result to json 33932 1726882900.54390: done dumping result, returning 33932 1726882900.54397: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'lsr101.90' [0e448fcc-3ce9-615b-5c48-0000000006c0] 33932 1726882900.54401: sending task result for task 0e448fcc-3ce9-615b-5c48-0000000006c0 33932 1726882900.54656: done sending task result for task 0e448fcc-3ce9-615b-5c48-0000000006c0 33932 1726882900.54659: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 33932 1726882900.54711: no more pending results, returning what we have 33932 1726882900.54714: results queue empty 33932 1726882900.54715: checking for any_errors_fatal 33932 1726882900.54722: done checking for any_errors_fatal 33932 1726882900.54723: checking for max_fail_percentage 33932 1726882900.54725: done checking for max_fail_percentage 33932 1726882900.54726: checking to see if all hosts have failed and the running result is not ok 33932 1726882900.54727: done checking to see if all hosts have failed 33932 1726882900.54727: getting the remaining hosts for this loop 33932 1726882900.54729: done getting the remaining hosts for this loop 33932 1726882900.54732: getting the next task for host managed_node1 33932 1726882900.54738: done getting next task for host managed_node1 33932 1726882900.54741: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 33932 1726882900.54744: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33932 1726882900.54747: getting variables 33932 1726882900.54749: in VariableManager get_vars() 33932 1726882900.54796: Calling all_inventory to load vars for managed_node1 33932 1726882900.54799: Calling groups_inventory to load vars for managed_node1 33932 1726882900.54801: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882900.54811: Calling all_plugins_play to load vars for managed_node1 33932 1726882900.54813: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882900.54816: Calling groups_plugins_play to load vars for managed_node1 33932 1726882900.56915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882900.58980: done with get_vars() 33932 1726882900.59000: done getting variables 33932 1726882900.59055: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 33932 1726882900.59174: variable 'profile' from source: include params 33932 1726882900.59178: variable 'item' from source: include params 33932 1726882900.59232: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'lsr101.90'] ******* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:41:40 -0400 
(0:00:00.101) 0:00:21.060 ****** 33932 1726882900.59304: entering _queue_task() for managed_node1/assert 33932 1726882900.59816: worker is 1 (out of 1 available) 33932 1726882900.59831: exiting _queue_task() for managed_node1/assert 33932 1726882900.59842: done queuing things up, now waiting for results queue to drain 33932 1726882900.59844: waiting for pending results... 33932 1726882900.60700: running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'lsr101.90' 33932 1726882900.60787: in run() - task 0e448fcc-3ce9-615b-5c48-0000000006c1 33932 1726882900.60809: variable 'ansible_search_path' from source: unknown 33932 1726882900.60814: variable 'ansible_search_path' from source: unknown 33932 1726882900.60846: calling self._execute() 33932 1726882900.60941: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882900.60945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882900.60955: variable 'omit' from source: magic vars 33932 1726882900.61301: variable 'ansible_distribution_major_version' from source: facts 33932 1726882900.61312: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882900.61318: variable 'omit' from source: magic vars 33932 1726882900.61360: variable 'omit' from source: magic vars 33932 1726882900.61459: variable 'profile' from source: include params 33932 1726882900.61463: variable 'item' from source: include params 33932 1726882900.61525: variable 'item' from source: include params 33932 1726882900.61543: variable 'omit' from source: magic vars 33932 1726882900.61587: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882900.61618: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882900.61637: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 
1726882900.61658: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882900.61673: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882900.61699: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882900.61702: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882900.61705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882900.61806: Set connection var ansible_shell_executable to /bin/sh 33932 1726882900.61813: Set connection var ansible_timeout to 10 33932 1726882900.61819: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882900.61824: Set connection var ansible_pipelining to False 33932 1726882900.61827: Set connection var ansible_connection to ssh 33932 1726882900.61829: Set connection var ansible_shell_type to sh 33932 1726882900.61854: variable 'ansible_shell_executable' from source: unknown 33932 1726882900.61857: variable 'ansible_connection' from source: unknown 33932 1726882900.61860: variable 'ansible_module_compression' from source: unknown 33932 1726882900.61862: variable 'ansible_shell_type' from source: unknown 33932 1726882900.61866: variable 'ansible_shell_executable' from source: unknown 33932 1726882900.61872: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882900.61875: variable 'ansible_pipelining' from source: unknown 33932 1726882900.61878: variable 'ansible_timeout' from source: unknown 33932 1726882900.61882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882900.62010: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882900.62020: variable 'omit' from source: magic vars 33932 1726882900.62026: starting attempt loop 33932 1726882900.62029: running the handler 33932 1726882900.62135: variable 'lsr_net_profile_ansible_managed' from source: set_fact 33932 1726882900.62138: Evaluated conditional (lsr_net_profile_ansible_managed): True 33932 1726882900.62146: handler run complete 33932 1726882900.62159: attempt loop complete, returning result 33932 1726882900.62162: _execute() done 33932 1726882900.62167: dumping result to json 33932 1726882900.62173: done dumping result, returning 33932 1726882900.62179: done running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'lsr101.90' [0e448fcc-3ce9-615b-5c48-0000000006c1] 33932 1726882900.62182: sending task result for task 0e448fcc-3ce9-615b-5c48-0000000006c1 33932 1726882900.62271: done sending task result for task 0e448fcc-3ce9-615b-5c48-0000000006c1 33932 1726882900.62274: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 33932 1726882900.62328: no more pending results, returning what we have 33932 1726882900.62331: results queue empty 33932 1726882900.62332: checking for any_errors_fatal 33932 1726882900.62340: done checking for any_errors_fatal 33932 1726882900.62341: checking for max_fail_percentage 33932 1726882900.62343: done checking for max_fail_percentage 33932 1726882900.62344: checking to see if all hosts have failed and the running result is not ok 33932 1726882900.62345: done checking to see if all hosts have failed 33932 1726882900.62345: getting the remaining hosts for this loop 33932 1726882900.62347: done getting the remaining hosts for this loop 33932 1726882900.62351: getting the next task for host managed_node1 33932 1726882900.62356: done 
getting next task for host managed_node1 33932 1726882900.62358: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 33932 1726882900.62361: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33932 1726882900.62366: getting variables 33932 1726882900.62370: in VariableManager get_vars() 33932 1726882900.62406: Calling all_inventory to load vars for managed_node1 33932 1726882900.62409: Calling groups_inventory to load vars for managed_node1 33932 1726882900.62411: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882900.62421: Calling all_plugins_play to load vars for managed_node1 33932 1726882900.62424: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882900.62426: Calling groups_plugins_play to load vars for managed_node1 33932 1726882900.64810: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882900.66614: done with get_vars() 33932 1726882900.66642: done getting variables 33932 1726882900.66707: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 33932 1726882900.66847: variable 'profile' from source: include params 33932 1726882900.66851: variable 'item' 
from source: include params 33932 1726882900.66945: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in lsr101.90] ************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:41:40 -0400 (0:00:00.077) 0:00:21.137 ****** 33932 1726882900.67024: entering _queue_task() for managed_node1/assert 33932 1726882900.67930: worker is 1 (out of 1 available) 33932 1726882900.67943: exiting _queue_task() for managed_node1/assert 33932 1726882900.67954: done queuing things up, now waiting for results queue to drain 33932 1726882900.67956: waiting for pending results... 33932 1726882900.68574: running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in lsr101.90 33932 1726882900.68691: in run() - task 0e448fcc-3ce9-615b-5c48-0000000006c2 33932 1726882900.68707: variable 'ansible_search_path' from source: unknown 33932 1726882900.68711: variable 'ansible_search_path' from source: unknown 33932 1726882900.68741: calling self._execute() 33932 1726882900.68842: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882900.68846: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882900.68858: variable 'omit' from source: magic vars 33932 1726882900.69456: variable 'ansible_distribution_major_version' from source: facts 33932 1726882900.69470: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882900.69482: variable 'omit' from source: magic vars 33932 1726882900.69517: variable 'omit' from source: magic vars 33932 1726882900.69722: variable 'profile' from source: include params 33932 1726882900.69726: variable 'item' from source: include params 33932 1726882900.69903: variable 'item' from source: include params 33932 1726882900.69921: variable 'omit' from source: magic vars 33932 
1726882900.69961: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882900.70108: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882900.70127: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882900.70144: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882900.70155: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882900.70310: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882900.70314: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882900.70317: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882900.70537: Set connection var ansible_shell_executable to /bin/sh 33932 1726882900.70545: Set connection var ansible_timeout to 10 33932 1726882900.70550: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882900.70555: Set connection var ansible_pipelining to False 33932 1726882900.70558: Set connection var ansible_connection to ssh 33932 1726882900.70561: Set connection var ansible_shell_type to sh 33932 1726882900.70592: variable 'ansible_shell_executable' from source: unknown 33932 1726882900.70595: variable 'ansible_connection' from source: unknown 33932 1726882900.70597: variable 'ansible_module_compression' from source: unknown 33932 1726882900.70600: variable 'ansible_shell_type' from source: unknown 33932 1726882900.70602: variable 'ansible_shell_executable' from source: unknown 33932 1726882900.70604: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882900.70606: variable 'ansible_pipelining' from source: unknown 33932 1726882900.70609: variable 
'ansible_timeout' from source: unknown 33932 1726882900.70659: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882900.71121: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882900.71131: variable 'omit' from source: magic vars 33932 1726882900.71136: starting attempt loop 33932 1726882900.71139: running the handler 33932 1726882900.71364: variable 'lsr_net_profile_fingerprint' from source: set_fact 33932 1726882900.71370: Evaluated conditional (lsr_net_profile_fingerprint): True 33932 1726882900.71382: handler run complete 33932 1726882900.71398: attempt loop complete, returning result 33932 1726882900.71401: _execute() done 33932 1726882900.71403: dumping result to json 33932 1726882900.71406: done dumping result, returning 33932 1726882900.71412: done running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in lsr101.90 [0e448fcc-3ce9-615b-5c48-0000000006c2] 33932 1726882900.71418: sending task result for task 0e448fcc-3ce9-615b-5c48-0000000006c2 33932 1726882900.71510: done sending task result for task 0e448fcc-3ce9-615b-5c48-0000000006c2 33932 1726882900.71513: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 33932 1726882900.71559: no more pending results, returning what we have 33932 1726882900.71562: results queue empty 33932 1726882900.71564: checking for any_errors_fatal 33932 1726882900.71571: done checking for any_errors_fatal 33932 1726882900.71572: checking for max_fail_percentage 33932 1726882900.71574: done checking for max_fail_percentage 33932 1726882900.71575: checking to see if all hosts have failed and the running result is not ok 33932 1726882900.71576: done 
checking to see if all hosts have failed 33932 1726882900.71576: getting the remaining hosts for this loop 33932 1726882900.71578: done getting the remaining hosts for this loop 33932 1726882900.71581: getting the next task for host managed_node1 33932 1726882900.71588: done getting next task for host managed_node1 33932 1726882900.71591: ^ task is: TASK: TEARDOWN: remove profiles. 33932 1726882900.71593: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33932 1726882900.71597: getting variables 33932 1726882900.71599: in VariableManager get_vars() 33932 1726882900.71639: Calling all_inventory to load vars for managed_node1 33932 1726882900.71642: Calling groups_inventory to load vars for managed_node1 33932 1726882900.71645: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882900.71655: Calling all_plugins_play to load vars for managed_node1 33932 1726882900.71657: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882900.71660: Calling groups_plugins_play to load vars for managed_node1 33932 1726882900.73823: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882900.75819: done with get_vars() 33932 1726882900.75845: done getting variables 33932 1726882900.75908: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [TEARDOWN: remove profiles.] 
********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:58 Friday 20 September 2024 21:41:40 -0400 (0:00:00.089) 0:00:21.227 ****** 33932 1726882900.75941: entering _queue_task() for managed_node1/debug 33932 1726882900.76271: worker is 1 (out of 1 available) 33932 1726882900.76285: exiting _queue_task() for managed_node1/debug 33932 1726882900.76295: done queuing things up, now waiting for results queue to drain 33932 1726882900.76297: waiting for pending results... 33932 1726882900.76581: running TaskExecutor() for managed_node1/TASK: TEARDOWN: remove profiles. 33932 1726882900.76680: in run() - task 0e448fcc-3ce9-615b-5c48-00000000005d 33932 1726882900.76696: variable 'ansible_search_path' from source: unknown 33932 1726882900.76732: calling self._execute() 33932 1726882900.76838: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882900.76844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882900.76863: variable 'omit' from source: magic vars 33932 1726882900.77258: variable 'ansible_distribution_major_version' from source: facts 33932 1726882900.77275: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882900.77281: variable 'omit' from source: magic vars 33932 1726882900.77305: variable 'omit' from source: magic vars 33932 1726882900.77338: variable 'omit' from source: magic vars 33932 1726882900.77389: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882900.77426: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882900.77445: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882900.77468: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 33932 1726882900.77484: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882900.77517: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882900.77521: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882900.77523: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882900.77626: Set connection var ansible_shell_executable to /bin/sh 33932 1726882900.77633: Set connection var ansible_timeout to 10 33932 1726882900.77638: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882900.77643: Set connection var ansible_pipelining to False 33932 1726882900.77645: Set connection var ansible_connection to ssh 33932 1726882900.77648: Set connection var ansible_shell_type to sh 33932 1726882900.77681: variable 'ansible_shell_executable' from source: unknown 33932 1726882900.77684: variable 'ansible_connection' from source: unknown 33932 1726882900.77687: variable 'ansible_module_compression' from source: unknown 33932 1726882900.77689: variable 'ansible_shell_type' from source: unknown 33932 1726882900.77692: variable 'ansible_shell_executable' from source: unknown 33932 1726882900.77694: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882900.77696: variable 'ansible_pipelining' from source: unknown 33932 1726882900.77699: variable 'ansible_timeout' from source: unknown 33932 1726882900.77703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882900.77838: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 
1726882900.77849: variable 'omit' from source: magic vars 33932 1726882900.77854: starting attempt loop 33932 1726882900.77857: running the handler 33932 1726882900.77919: handler run complete 33932 1726882900.77942: attempt loop complete, returning result 33932 1726882900.77945: _execute() done 33932 1726882900.77947: dumping result to json 33932 1726882900.77950: done dumping result, returning 33932 1726882900.77957: done running TaskExecutor() for managed_node1/TASK: TEARDOWN: remove profiles. [0e448fcc-3ce9-615b-5c48-00000000005d] 33932 1726882900.77962: sending task result for task 0e448fcc-3ce9-615b-5c48-00000000005d 33932 1726882900.78059: done sending task result for task 0e448fcc-3ce9-615b-5c48-00000000005d 33932 1726882900.78062: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: ################################################## 33932 1726882900.78121: no more pending results, returning what we have 33932 1726882900.78125: results queue empty 33932 1726882900.78126: checking for any_errors_fatal 33932 1726882900.78134: done checking for any_errors_fatal 33932 1726882900.78134: checking for max_fail_percentage 33932 1726882900.78137: done checking for max_fail_percentage 33932 1726882900.78138: checking to see if all hosts have failed and the running result is not ok 33932 1726882900.78139: done checking to see if all hosts have failed 33932 1726882900.78140: getting the remaining hosts for this loop 33932 1726882900.78142: done getting the remaining hosts for this loop 33932 1726882900.78146: getting the next task for host managed_node1 33932 1726882900.78154: done getting next task for host managed_node1 33932 1726882900.78161: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 33932 1726882900.78165: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33932 1726882900.78190: getting variables 33932 1726882900.78192: in VariableManager get_vars() 33932 1726882900.78236: Calling all_inventory to load vars for managed_node1 33932 1726882900.78239: Calling groups_inventory to load vars for managed_node1 33932 1726882900.78242: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882900.78254: Calling all_plugins_play to load vars for managed_node1 33932 1726882900.78257: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882900.78260: Calling groups_plugins_play to load vars for managed_node1 33932 1726882900.80388: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882900.82282: done with get_vars() 33932 1726882900.82308: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:41:40 -0400 (0:00:00.064) 0:00:21.291 ****** 33932 1726882900.82402: entering _queue_task() for managed_node1/include_tasks 33932 1726882900.82694: worker is 1 (out of 1 available) 33932 1726882900.82706: exiting _queue_task() for managed_node1/include_tasks 33932 1726882900.82717: done queuing things up, now waiting for results queue to drain 33932 1726882900.82719: waiting for pending results... 
33932 1726882900.83002: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 33932 1726882900.83132: in run() - task 0e448fcc-3ce9-615b-5c48-000000000065 33932 1726882900.83145: variable 'ansible_search_path' from source: unknown 33932 1726882900.83149: variable 'ansible_search_path' from source: unknown 33932 1726882900.83191: calling self._execute() 33932 1726882900.83290: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882900.83294: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882900.83305: variable 'omit' from source: magic vars 33932 1726882900.83670: variable 'ansible_distribution_major_version' from source: facts 33932 1726882900.83685: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882900.83691: _execute() done 33932 1726882900.83696: dumping result to json 33932 1726882900.83698: done dumping result, returning 33932 1726882900.83711: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-615b-5c48-000000000065] 33932 1726882900.83720: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000065 33932 1726882900.83806: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000065 33932 1726882900.83809: WORKER PROCESS EXITING 33932 1726882900.83857: no more pending results, returning what we have 33932 1726882900.83862: in VariableManager get_vars() 33932 1726882900.83919: Calling all_inventory to load vars for managed_node1 33932 1726882900.83922: Calling groups_inventory to load vars for managed_node1 33932 1726882900.83925: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882900.83937: Calling all_plugins_play to load vars for managed_node1 33932 1726882900.83940: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882900.83943: Calling 
groups_plugins_play to load vars for managed_node1 33932 1726882900.90609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882900.92388: done with get_vars() 33932 1726882900.92409: variable 'ansible_search_path' from source: unknown 33932 1726882900.92410: variable 'ansible_search_path' from source: unknown 33932 1726882900.92455: we have included files to process 33932 1726882900.92456: generating all_blocks data 33932 1726882900.92458: done generating all_blocks data 33932 1726882900.92461: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 33932 1726882900.92462: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 33932 1726882900.92467: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 33932 1726882900.93044: done processing included file 33932 1726882900.93046: iterating over new_blocks loaded from include file 33932 1726882900.93048: in VariableManager get_vars() 33932 1726882900.93077: done with get_vars() 33932 1726882900.93079: filtering new block on tags 33932 1726882900.93101: done filtering new block on tags 33932 1726882900.93104: in VariableManager get_vars() 33932 1726882900.93126: done with get_vars() 33932 1726882900.93128: filtering new block on tags 33932 1726882900.93147: done filtering new block on tags 33932 1726882900.93150: in VariableManager get_vars() 33932 1726882900.93176: done with get_vars() 33932 1726882900.93177: filtering new block on tags 33932 1726882900.93195: done filtering new block on tags 33932 1726882900.93197: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 33932 1726882900.93206: extending task lists for 
all hosts with included blocks 33932 1726882900.94055: done extending task lists 33932 1726882900.94056: done processing included files 33932 1726882900.94057: results queue empty 33932 1726882900.94058: checking for any_errors_fatal 33932 1726882900.94060: done checking for any_errors_fatal 33932 1726882900.94061: checking for max_fail_percentage 33932 1726882900.94062: done checking for max_fail_percentage 33932 1726882900.94070: checking to see if all hosts have failed and the running result is not ok 33932 1726882900.94071: done checking to see if all hosts have failed 33932 1726882900.94072: getting the remaining hosts for this loop 33932 1726882900.94073: done getting the remaining hosts for this loop 33932 1726882900.94076: getting the next task for host managed_node1 33932 1726882900.94079: done getting next task for host managed_node1 33932 1726882900.94082: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 33932 1726882900.94084: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882900.94094: getting variables 33932 1726882900.94095: in VariableManager get_vars() 33932 1726882900.94110: Calling all_inventory to load vars for managed_node1 33932 1726882900.94112: Calling groups_inventory to load vars for managed_node1 33932 1726882900.94114: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882900.94119: Calling all_plugins_play to load vars for managed_node1 33932 1726882900.94121: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882900.94124: Calling groups_plugins_play to load vars for managed_node1 33932 1726882900.95486: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882900.97277: done with get_vars() 33932 1726882900.97296: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:41:40 -0400 (0:00:00.149) 0:00:21.441 ****** 33932 1726882900.97375: entering _queue_task() for managed_node1/setup 33932 1726882900.97727: worker is 1 (out of 1 available) 33932 1726882900.97746: exiting _queue_task() for managed_node1/setup 33932 1726882900.97760: done queuing things up, now waiting for results queue to drain 33932 1726882900.97762: waiting for pending results... 
33932 1726882900.98112: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 33932 1726882900.98291: in run() - task 0e448fcc-3ce9-615b-5c48-000000000883 33932 1726882900.98312: variable 'ansible_search_path' from source: unknown 33932 1726882900.98320: variable 'ansible_search_path' from source: unknown 33932 1726882900.98363: calling self._execute() 33932 1726882900.98467: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882900.98485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882900.98499: variable 'omit' from source: magic vars 33932 1726882900.98885: variable 'ansible_distribution_major_version' from source: facts 33932 1726882900.98904: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882900.99499: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 33932 1726882901.01792: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 33932 1726882901.01866: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 33932 1726882901.01907: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 33932 1726882901.01943: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 33932 1726882901.01977: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 33932 1726882901.02061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882901.02097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882901.02128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882901.02178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882901.02200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882901.02255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882901.02287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882901.02317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882901.02361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882901.02384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882901.02538: variable '__network_required_facts' from source: role 
'' defaults 33932 1726882901.02552: variable 'ansible_facts' from source: unknown 33932 1726882901.03237: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 33932 1726882901.03246: when evaluation is False, skipping this task 33932 1726882901.03253: _execute() done 33932 1726882901.03259: dumping result to json 33932 1726882901.03267: done dumping result, returning 33932 1726882901.03279: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-615b-5c48-000000000883] 33932 1726882901.03287: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000883 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 33932 1726882901.03425: no more pending results, returning what we have 33932 1726882901.03429: results queue empty 33932 1726882901.03430: checking for any_errors_fatal 33932 1726882901.03432: done checking for any_errors_fatal 33932 1726882901.03433: checking for max_fail_percentage 33932 1726882901.03435: done checking for max_fail_percentage 33932 1726882901.03436: checking to see if all hosts have failed and the running result is not ok 33932 1726882901.03436: done checking to see if all hosts have failed 33932 1726882901.03437: getting the remaining hosts for this loop 33932 1726882901.03439: done getting the remaining hosts for this loop 33932 1726882901.03443: getting the next task for host managed_node1 33932 1726882901.03450: done getting next task for host managed_node1 33932 1726882901.03454: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 33932 1726882901.03458: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33932 1726882901.03484: getting variables 33932 1726882901.03486: in VariableManager get_vars() 33932 1726882901.03525: Calling all_inventory to load vars for managed_node1 33932 1726882901.03528: Calling groups_inventory to load vars for managed_node1 33932 1726882901.03530: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882901.03541: Calling all_plugins_play to load vars for managed_node1 33932 1726882901.03543: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882901.03546: Calling groups_plugins_play to load vars for managed_node1 33932 1726882901.04066: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000883 33932 1726882901.04071: WORKER PROCESS EXITING 33932 1726882901.05049: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882901.06770: done with get_vars() 33932 1726882901.06794: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:41:41 -0400 (0:00:00.095) 0:00:21.536 ****** 33932 1726882901.06902: entering _queue_task() for managed_node1/stat 33932 1726882901.07217: worker is 
1 (out of 1 available) 33932 1726882901.07228: exiting _queue_task() for managed_node1/stat 33932 1726882901.07239: done queuing things up, now waiting for results queue to drain 33932 1726882901.07241: waiting for pending results... 33932 1726882901.07513: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 33932 1726882901.07669: in run() - task 0e448fcc-3ce9-615b-5c48-000000000885 33932 1726882901.07692: variable 'ansible_search_path' from source: unknown 33932 1726882901.07699: variable 'ansible_search_path' from source: unknown 33932 1726882901.07735: calling self._execute() 33932 1726882901.07839: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882901.07848: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882901.07861: variable 'omit' from source: magic vars 33932 1726882901.08225: variable 'ansible_distribution_major_version' from source: facts 33932 1726882901.08242: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882901.08405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 33932 1726882901.08678: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 33932 1726882901.08726: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 33932 1726882901.08770: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 33932 1726882901.08808: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 33932 1726882901.09211: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 33932 1726882901.09240: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 33932 1726882901.09273: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882901.09308: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 33932 1726882901.09393: variable '__network_is_ostree' from source: set_fact 33932 1726882901.09404: Evaluated conditional (not __network_is_ostree is defined): False 33932 1726882901.09415: when evaluation is False, skipping this task 33932 1726882901.09422: _execute() done 33932 1726882901.09429: dumping result to json 33932 1726882901.09438: done dumping result, returning 33932 1726882901.09449: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-615b-5c48-000000000885] 33932 1726882901.09461: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000885 skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 33932 1726882901.09605: no more pending results, returning what we have 33932 1726882901.09609: results queue empty 33932 1726882901.09610: checking for any_errors_fatal 33932 1726882901.09618: done checking for any_errors_fatal 33932 1726882901.09619: checking for max_fail_percentage 33932 1726882901.09621: done checking for max_fail_percentage 33932 1726882901.09622: checking to see if all hosts have failed and the running result is not ok 33932 1726882901.09623: done checking to see if all hosts have failed 33932 1726882901.09623: getting the remaining hosts for this loop 33932 
1726882901.09625: done getting the remaining hosts for this loop 33932 1726882901.09629: getting the next task for host managed_node1 33932 1726882901.09636: done getting next task for host managed_node1 33932 1726882901.09640: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 33932 1726882901.09644: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882901.09661: getting variables 33932 1726882901.09665: in VariableManager get_vars() 33932 1726882901.09705: Calling all_inventory to load vars for managed_node1 33932 1726882901.09707: Calling groups_inventory to load vars for managed_node1 33932 1726882901.09710: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882901.09721: Calling all_plugins_play to load vars for managed_node1 33932 1726882901.09723: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882901.09726: Calling groups_plugins_play to load vars for managed_node1 33932 1726882901.10966: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000885 33932 1726882901.10970: WORKER PROCESS EXITING 33932 1726882901.11488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882901.13130: done with get_vars() 33932 1726882901.13153: done getting variables 33932 1726882901.13215: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:41:41 -0400 (0:00:00.063) 0:00:21.600 ****** 33932 1726882901.13251: entering _queue_task() for managed_node1/set_fact 33932 1726882901.13542: worker is 1 (out of 1 available) 33932 1726882901.13555: exiting _queue_task() for managed_node1/set_fact 33932 1726882901.13568: done queuing things up, now waiting for results queue to drain 33932 1726882901.13569: waiting for pending results... 
33932 1726882901.13851: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 33932 1726882901.14027: in run() - task 0e448fcc-3ce9-615b-5c48-000000000886 33932 1726882901.14047: variable 'ansible_search_path' from source: unknown 33932 1726882901.14055: variable 'ansible_search_path' from source: unknown 33932 1726882901.14096: calling self._execute() 33932 1726882901.14200: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882901.14211: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882901.14226: variable 'omit' from source: magic vars 33932 1726882901.14603: variable 'ansible_distribution_major_version' from source: facts 33932 1726882901.14620: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882901.14798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 33932 1726882901.15068: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 33932 1726882901.15121: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 33932 1726882901.15161: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 33932 1726882901.15202: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 33932 1726882901.15342: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 33932 1726882901.15374: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 33932 1726882901.15406: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882901.15443: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 33932 1726882901.15540: variable '__network_is_ostree' from source: set_fact 33932 1726882901.15555: Evaluated conditional (not __network_is_ostree is defined): False 33932 1726882901.15562: when evaluation is False, skipping this task 33932 1726882901.15571: _execute() done 33932 1726882901.15578: dumping result to json 33932 1726882901.15585: done dumping result, returning 33932 1726882901.15594: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-615b-5c48-000000000886] 33932 1726882901.15604: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000886 skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 33932 1726882901.15744: no more pending results, returning what we have 33932 1726882901.15747: results queue empty 33932 1726882901.15749: checking for any_errors_fatal 33932 1726882901.15757: done checking for any_errors_fatal 33932 1726882901.15757: checking for max_fail_percentage 33932 1726882901.15759: done checking for max_fail_percentage 33932 1726882901.15760: checking to see if all hosts have failed and the running result is not ok 33932 1726882901.15761: done checking to see if all hosts have failed 33932 1726882901.15762: getting the remaining hosts for this loop 33932 1726882901.15765: done getting the remaining hosts for this loop 33932 1726882901.15769: getting the next task for host managed_node1 33932 1726882901.15777: done getting next task for host managed_node1 33932 
1726882901.15781: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 33932 1726882901.15786: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33932 1726882901.15805: getting variables 33932 1726882901.15808: in VariableManager get_vars() 33932 1726882901.15846: Calling all_inventory to load vars for managed_node1 33932 1726882901.15849: Calling groups_inventory to load vars for managed_node1 33932 1726882901.15851: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882901.15862: Calling all_plugins_play to load vars for managed_node1 33932 1726882901.15866: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882901.15869: Calling groups_plugins_play to load vars for managed_node1 33932 1726882901.17056: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000886 33932 1726882901.17060: WORKER PROCESS EXITING 33932 1726882901.17072: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882901.18103: done with get_vars() 33932 1726882901.18118: done getting variables TASK [fedora.linux_system_roles.network : Check which 
services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:41:41 -0400 (0:00:00.049) 0:00:21.649 ****** 33932 1726882901.18187: entering _queue_task() for managed_node1/service_facts 33932 1726882901.18442: worker is 1 (out of 1 available) 33932 1726882901.18455: exiting _queue_task() for managed_node1/service_facts 33932 1726882901.18468: done queuing things up, now waiting for results queue to drain 33932 1726882901.18470: waiting for pending results... 33932 1726882901.18750: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 33932 1726882901.18894: in run() - task 0e448fcc-3ce9-615b-5c48-000000000888 33932 1726882901.18907: variable 'ansible_search_path' from source: unknown 33932 1726882901.18910: variable 'ansible_search_path' from source: unknown 33932 1726882901.18951: calling self._execute() 33932 1726882901.19047: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882901.19051: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882901.19060: variable 'omit' from source: magic vars 33932 1726882901.19429: variable 'ansible_distribution_major_version' from source: facts 33932 1726882901.19438: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882901.19441: variable 'omit' from source: magic vars 33932 1726882901.19523: variable 'omit' from source: magic vars 33932 1726882901.19559: variable 'omit' from source: magic vars 33932 1726882901.19594: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882901.19627: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882901.19644: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 
1726882901.19660: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882901.19674: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882901.19709: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882901.19713: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882901.19716: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882901.19819: Set connection var ansible_shell_executable to /bin/sh 33932 1726882901.19825: Set connection var ansible_timeout to 10 33932 1726882901.19831: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882901.19835: Set connection var ansible_pipelining to False 33932 1726882901.19838: Set connection var ansible_connection to ssh 33932 1726882901.19841: Set connection var ansible_shell_type to sh 33932 1726882901.19857: variable 'ansible_shell_executable' from source: unknown 33932 1726882901.19860: variable 'ansible_connection' from source: unknown 33932 1726882901.19866: variable 'ansible_module_compression' from source: unknown 33932 1726882901.19871: variable 'ansible_shell_type' from source: unknown 33932 1726882901.19874: variable 'ansible_shell_executable' from source: unknown 33932 1726882901.19881: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882901.19905: variable 'ansible_pipelining' from source: unknown 33932 1726882901.19910: variable 'ansible_timeout' from source: unknown 33932 1726882901.19912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882901.20045: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 33932 1726882901.20053: variable 'omit' from source: magic vars 33932 1726882901.20058: starting attempt loop 33932 1726882901.20061: running the handler 33932 1726882901.20077: _low_level_execute_command(): starting 33932 1726882901.20084: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 33932 1726882901.20579: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882901.20595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882901.20607: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882901.20619: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882901.20680: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882901.20697: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882901.20791: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882901.22451: stdout chunk (state=3): >>>/root <<< 33932 
1726882901.22606: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882901.22610: stderr chunk (state=3): >>><<< 33932 1726882901.22615: stdout chunk (state=3): >>><<< 33932 1726882901.22638: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882901.22651: _low_level_execute_command(): starting 33932 1726882901.22656: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882901.2263682-34951-19405662051144 `" && echo ansible-tmp-1726882901.2263682-34951-19405662051144="` echo /root/.ansible/tmp/ansible-tmp-1726882901.2263682-34951-19405662051144 `" ) && sleep 0' 33932 1726882901.23252: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882901.23262: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882901.23288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882901.23302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882901.23374: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882901.23382: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882901.23394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882901.23403: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882901.23410: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882901.23417: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882901.23424: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882901.23434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882901.23445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882901.23452: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882901.23459: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882901.23473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882901.23561: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882901.23591: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882901.23593: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882901.23686: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882901.25565: stdout chunk (state=3): >>>ansible-tmp-1726882901.2263682-34951-19405662051144=/root/.ansible/tmp/ansible-tmp-1726882901.2263682-34951-19405662051144 <<< 33932 1726882901.25691: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882901.25724: stderr chunk (state=3): >>><<< 33932 1726882901.25726: stdout chunk (state=3): >>><<< 33932 1726882901.25769: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882901.2263682-34951-19405662051144=/root/.ansible/tmp/ansible-tmp-1726882901.2263682-34951-19405662051144 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882901.25779: variable 'ansible_module_compression' from source: unknown 33932 1726882901.26325: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-33932njcnmxb6/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 33932 1726882901.26328: variable 'ansible_facts' from source: unknown 33932 1726882901.26331: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882901.2263682-34951-19405662051144/AnsiballZ_service_facts.py 33932 1726882901.26333: Sending initial data 33932 1726882901.26335: Sent initial data (161 bytes) 33932 1726882901.26989: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882901.27016: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882901.27030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882901.27198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882901.27237: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882901.27240: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882901.27338: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882901.29065: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server 
supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 33932 1726882901.29154: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 33932 1726882901.29247: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-33932njcnmxb6/tmpb2m02lum /root/.ansible/tmp/ansible-tmp-1726882901.2263682-34951-19405662051144/AnsiballZ_service_facts.py <<< 33932 1726882901.29340: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 33932 1726882901.30584: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882901.30713: stderr chunk (state=3): >>><<< 33932 1726882901.30716: stdout chunk (state=3): >>><<< 33932 1726882901.30730: done transferring module to remote 33932 1726882901.30739: _low_level_execute_command(): starting 33932 1726882901.30744: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882901.2263682-34951-19405662051144/ /root/.ansible/tmp/ansible-tmp-1726882901.2263682-34951-19405662051144/AnsiballZ_service_facts.py && sleep 0' 33932 1726882901.31203: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882901.31206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 
1726882901.31239: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882901.31243: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882901.31245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882901.31297: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882901.31300: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882901.31407: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882901.33141: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882901.33192: stderr chunk (state=3): >>><<< 33932 1726882901.33195: stdout chunk (state=3): >>><<< 33932 1726882901.33203: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882901.33206: _low_level_execute_command(): starting 33932 1726882901.33211: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882901.2263682-34951-19405662051144/AnsiballZ_service_facts.py && sleep 0' 33932 1726882901.33633: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882901.33636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882901.33672: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 33932 1726882901.33676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882901.33678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 33932 
1726882901.33680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882901.33727: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882901.33731: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882901.33836: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882902.65558: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", <<< 33932 1726882902.65579: stdout chunk (state=3): >>>"source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": 
"systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.servi<<< 33932 1726882902.65591: stdout chunk (state=3): >>>ce": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": 
{"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": 
"dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "ina<<< 33932 1726882902.65594: stdout chunk (state=3): >>>ctive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": 
"indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "system<<< 33932 1726882902.65613: stdout chunk (state=3): >>>d"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 33932 1726882902.66790: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 33932 1726882902.66876: stderr chunk (state=3): >>><<< 33932 1726882902.66880: stdout chunk (state=3): >>><<< 33932 1726882902.66914: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": 
"enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": 
"systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": 
{"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": 
"ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": 
"inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", 
"state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": 
"systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", 
"state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.44.90 closed. 33932 1726882902.67568: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882901.2263682-34951-19405662051144/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 33932 1726882902.67580: _low_level_execute_command(): starting 33932 1726882902.67585: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882901.2263682-34951-19405662051144/ > /dev/null 2>&1 && sleep 0' 33932 1726882902.68281: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882902.68292: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882902.68302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882902.68316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882902.68354: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882902.68362: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882902.68378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882902.68392: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882902.68403: stderr chunk (state=3): >>>debug2: resolve_canonicalize: 
hostname 10.31.44.90 is address <<< 33932 1726882902.68409: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882902.68418: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882902.68426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882902.68437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882902.68445: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882902.68451: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882902.68460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882902.68543: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882902.68556: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882902.68568: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882902.68730: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882902.70516: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882902.70600: stderr chunk (state=3): >>><<< 33932 1726882902.70605: stdout chunk (state=3): >>><<< 33932 1726882902.70624: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882902.70630: handler run complete 33932 1726882902.70825: variable 'ansible_facts' from source: unknown 33932 1726882902.70999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882902.71466: variable 'ansible_facts' from source: unknown 33932 1726882902.71603: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882902.71804: attempt loop complete, returning result 33932 1726882902.71808: _execute() done 33932 1726882902.71810: dumping result to json 33932 1726882902.71873: done dumping result, returning 33932 1726882902.71884: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-615b-5c48-000000000888] 33932 1726882902.71889: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000888 33932 1726882902.72579: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000888 33932 1726882902.72582: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 33932 1726882902.72686: no more pending results, returning what we have 33932 1726882902.72689: results queue empty 33932 
1726882902.72690: checking for any_errors_fatal 33932 1726882902.72696: done checking for any_errors_fatal 33932 1726882902.72697: checking for max_fail_percentage 33932 1726882902.72698: done checking for max_fail_percentage 33932 1726882902.72699: checking to see if all hosts have failed and the running result is not ok 33932 1726882902.72700: done checking to see if all hosts have failed 33932 1726882902.72701: getting the remaining hosts for this loop 33932 1726882902.72702: done getting the remaining hosts for this loop 33932 1726882902.72706: getting the next task for host managed_node1 33932 1726882902.72712: done getting next task for host managed_node1 33932 1726882902.72716: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 33932 1726882902.72719: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882902.72730: getting variables 33932 1726882902.72732: in VariableManager get_vars() 33932 1726882902.72777: Calling all_inventory to load vars for managed_node1 33932 1726882902.72781: Calling groups_inventory to load vars for managed_node1 33932 1726882902.72784: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882902.72795: Calling all_plugins_play to load vars for managed_node1 33932 1726882902.72798: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882902.72806: Calling groups_plugins_play to load vars for managed_node1 33932 1726882902.74517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882902.76351: done with get_vars() 33932 1726882902.76377: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:41:42 -0400 (0:00:01.582) 0:00:23.232 ****** 33932 1726882902.76477: entering _queue_task() for managed_node1/package_facts 33932 1726882902.76786: worker is 1 (out of 1 available) 33932 1726882902.76799: exiting _queue_task() for managed_node1/package_facts 33932 1726882902.76810: done queuing things up, now waiting for results queue to drain 33932 1726882902.76812: waiting for pending results... 
33932 1726882902.77106: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 33932 1726882902.77238: in run() - task 0e448fcc-3ce9-615b-5c48-000000000889 33932 1726882902.77251: variable 'ansible_search_path' from source: unknown 33932 1726882902.77258: variable 'ansible_search_path' from source: unknown 33932 1726882902.77300: calling self._execute() 33932 1726882902.77419: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882902.77428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882902.77438: variable 'omit' from source: magic vars 33932 1726882902.77852: variable 'ansible_distribution_major_version' from source: facts 33932 1726882902.77873: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882902.77880: variable 'omit' from source: magic vars 33932 1726882902.77960: variable 'omit' from source: magic vars 33932 1726882902.78003: variable 'omit' from source: magic vars 33932 1726882902.78048: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882902.78092: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882902.78111: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882902.78133: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882902.78145: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882902.78180: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882902.78183: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882902.78192: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 33932 1726882902.78311: Set connection var ansible_shell_executable to /bin/sh 33932 1726882902.78319: Set connection var ansible_timeout to 10 33932 1726882902.78323: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882902.78329: Set connection var ansible_pipelining to False 33932 1726882902.78332: Set connection var ansible_connection to ssh 33932 1726882902.78334: Set connection var ansible_shell_type to sh 33932 1726882902.78362: variable 'ansible_shell_executable' from source: unknown 33932 1726882902.78366: variable 'ansible_connection' from source: unknown 33932 1726882902.78369: variable 'ansible_module_compression' from source: unknown 33932 1726882902.78373: variable 'ansible_shell_type' from source: unknown 33932 1726882902.78376: variable 'ansible_shell_executable' from source: unknown 33932 1726882902.78378: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882902.78383: variable 'ansible_pipelining' from source: unknown 33932 1726882902.78385: variable 'ansible_timeout' from source: unknown 33932 1726882902.78389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882902.78612: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 33932 1726882902.78621: variable 'omit' from source: magic vars 33932 1726882902.78630: starting attempt loop 33932 1726882902.78634: running the handler 33932 1726882902.78646: _low_level_execute_command(): starting 33932 1726882902.78654: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 33932 1726882902.79464: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882902.79481: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 33932 1726882902.79491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882902.79510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882902.79553: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882902.79561: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882902.79576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882902.79591: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882902.79599: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882902.79611: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882902.79619: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882902.79630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882902.79642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882902.79651: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882902.79660: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882902.79681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882902.79754: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882902.79780: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882902.79796: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882902.79923: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 
1726882902.81558: stdout chunk (state=3): >>>/root <<< 33932 1726882902.81679: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882902.81732: stderr chunk (state=3): >>><<< 33932 1726882902.81735: stdout chunk (state=3): >>><<< 33932 1726882902.81758: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882902.81770: _low_level_execute_command(): starting 33932 1726882902.81780: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882902.8175514-35005-237830775297097 `" && echo ansible-tmp-1726882902.8175514-35005-237830775297097="` echo /root/.ansible/tmp/ansible-tmp-1726882902.8175514-35005-237830775297097 `" ) && sleep 0' 33932 1726882902.82409: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 33932 1726882902.82417: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882902.82427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882902.82440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882902.82480: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882902.82487: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882902.82498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882902.82514: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882902.82521: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882902.82528: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882902.82535: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882902.82544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882902.82555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882902.82562: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882902.82570: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882902.82582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882902.82658: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882902.82679: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882902.82692: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 33932 1726882902.82817: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882902.84687: stdout chunk (state=3): >>>ansible-tmp-1726882902.8175514-35005-237830775297097=/root/.ansible/tmp/ansible-tmp-1726882902.8175514-35005-237830775297097 <<< 33932 1726882902.84805: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882902.84868: stderr chunk (state=3): >>><<< 33932 1726882902.84874: stdout chunk (state=3): >>><<< 33932 1726882902.84892: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882902.8175514-35005-237830775297097=/root/.ansible/tmp/ansible-tmp-1726882902.8175514-35005-237830775297097 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882902.84932: variable 'ansible_module_compression' from source: unknown 33932 1726882902.84985: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-33932njcnmxb6/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 33932 1726882902.85042: variable 'ansible_facts' from source: unknown 33932 1726882902.85246: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882902.8175514-35005-237830775297097/AnsiballZ_package_facts.py 33932 1726882902.85393: Sending initial data 33932 1726882902.85396: Sent initial data (162 bytes) 33932 1726882902.86311: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882902.86317: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882902.86327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882902.86339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882902.86378: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882902.86384: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882902.86393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882902.86404: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882902.86411: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882902.86417: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882902.86424: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882902.86432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882902.86442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882902.86450: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 
<<< 33932 1726882902.86456: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882902.86465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882902.86539: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882902.86552: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882902.86563: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882902.86683: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882902.88404: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 33932 1726882902.88495: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 33932 1726882902.88590: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-33932njcnmxb6/tmpa3gpgm8u /root/.ansible/tmp/ansible-tmp-1726882902.8175514-35005-237830775297097/AnsiballZ_package_facts.py <<< 33932 1726882902.88687: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 33932 1726882902.91278: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882902.91353: stderr chunk (state=3): >>><<< 33932 1726882902.91356: stdout chunk (state=3): >>><<< 33932 1726882902.91380: done 
transferring module to remote 33932 1726882902.91390: _low_level_execute_command(): starting 33932 1726882902.91395: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882902.8175514-35005-237830775297097/ /root/.ansible/tmp/ansible-tmp-1726882902.8175514-35005-237830775297097/AnsiballZ_package_facts.py && sleep 0' 33932 1726882902.92031: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882902.92040: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882902.92050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882902.92066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882902.92106: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882902.92114: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882902.92123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882902.92137: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882902.92144: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882902.92150: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882902.92158: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882902.92171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882902.92185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882902.92193: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882902.92200: stderr chunk (state=3): >>>debug2: 
match found <<< 33932 1726882902.92209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882902.92284: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882902.92297: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882902.92307: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882902.92426: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882902.94157: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882902.94216: stderr chunk (state=3): >>><<< 33932 1726882902.94227: stdout chunk (state=3): >>><<< 33932 1726882902.94314: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882902.94317: 
_low_level_execute_command(): starting 33932 1726882902.94320: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882902.8175514-35005-237830775297097/AnsiballZ_package_facts.py && sleep 0' 33932 1726882902.94842: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882902.94855: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882902.94871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882902.94887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882902.94926: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882902.94938: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882902.94951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882902.94971: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882902.94982: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882902.94992: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882902.95002: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882902.95014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882902.95027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882902.95037: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882902.95047: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882902.95059: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882902.95137: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882902.95152: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882902.95170: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882902.95303: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882903.41870: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": 
[{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "e<<< 33932 1726882903.41931: stdout chunk (state=3): >>>poch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": 
"libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", 
"release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": 
"3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": 
"p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": 
"kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": 
"5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]<<< 33932 1726882903.41962: stdout chunk (state=3): >>>, "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": 
[{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": 
[{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202<<< 33932 1726882903.42007: stdout chunk (state=3): >>>", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": 
[{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-<<< 33932 1726882903.42020: stdout chunk (state=3): >>>base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", 
"version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": 
"0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", 
"release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_c<<< 33932 1726882903.42026: stdout chunk (state=3): >>>tl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, 
"arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, 
"arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": 
"noarch", <<< 33932 1726882903.42039: stdout chunk (state=3): >>>"source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", 
"release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": 
"perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": 
"462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", 
"version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils"<<< 33932 1726882903.42052: stdout chunk (state=3): >>>, "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, 
"arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], <<< 33932 1726882903.42083: stdout chunk (state=3): >>>"openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2<<< 33932 1726882903.42098: stdout chunk (state=3): >>>, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", 
"version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source":<<< 33932 1726882903.42101: stdout chunk (state=3): >>> "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 33932 1726882903.43582: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 33932 1726882903.43661: stderr chunk (state=3): >>><<< 33932 1726882903.43665: stdout chunk (state=3): >>><<< 33932 1726882903.43919: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": 
"3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": 
[{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": 
[{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": 
"python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": 
"python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": 
"libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": 
"34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": 
"1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": 
"2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": 
[{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": 
"5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": 
[{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": 
"1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": 
"perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", 
"version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": 
[{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": 
"6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 Shared connection to 10.31.44.90 closed. 33932 1726882903.45750: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882902.8175514-35005-237830775297097/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 33932 1726882903.45773: _low_level_execute_command(): starting 33932 1726882903.45776: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882902.8175514-35005-237830775297097/ > /dev/null 2>&1 && sleep 0' 33932 1726882903.46216: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882903.46221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882903.46253: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882903.46266: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882903.46318: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882903.46329: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882903.46434: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882903.48272: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882903.48316: stderr chunk (state=3): >>><<< 33932 1726882903.48319: stdout chunk (state=3): >>><<< 33932 1726882903.48333: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882903.48338: handler run complete 33932 
1726882903.48834: variable 'ansible_facts' from source: unknown 33932 1726882903.49132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882903.50322: variable 'ansible_facts' from source: unknown 33932 1726882903.50641: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882903.51086: attempt loop complete, returning result 33932 1726882903.51096: _execute() done 33932 1726882903.51099: dumping result to json 33932 1726882903.51224: done dumping result, returning 33932 1726882903.51234: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-615b-5c48-000000000889] 33932 1726882903.51237: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000889 33932 1726882903.52540: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000889 33932 1726882903.52544: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 33932 1726882903.52636: no more pending results, returning what we have 33932 1726882903.52638: results queue empty 33932 1726882903.52639: checking for any_errors_fatal 33932 1726882903.52643: done checking for any_errors_fatal 33932 1726882903.52644: checking for max_fail_percentage 33932 1726882903.52645: done checking for max_fail_percentage 33932 1726882903.52645: checking to see if all hosts have failed and the running result is not ok 33932 1726882903.52646: done checking to see if all hosts have failed 33932 1726882903.52646: getting the remaining hosts for this loop 33932 1726882903.52647: done getting the remaining hosts for this loop 33932 1726882903.52650: getting the next task for host managed_node1 33932 1726882903.52654: done getting next task for host managed_node1 33932 1726882903.52656: ^ task is: 
TASK: fedora.linux_system_roles.network : Print network provider 33932 1726882903.52658: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33932 1726882903.52667: getting variables 33932 1726882903.52670: in VariableManager get_vars() 33932 1726882903.52694: Calling all_inventory to load vars for managed_node1 33932 1726882903.52696: Calling groups_inventory to load vars for managed_node1 33932 1726882903.52698: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882903.52706: Calling all_plugins_play to load vars for managed_node1 33932 1726882903.52708: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882903.52710: Calling groups_plugins_play to load vars for managed_node1 33932 1726882903.53473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882903.55057: done with get_vars() 33932 1726882903.55077: done getting variables 33932 1726882903.55120: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 
Friday 20 September 2024 21:41:43 -0400 (0:00:00.786) 0:00:24.019 ****** 33932 1726882903.55143: entering _queue_task() for managed_node1/debug 33932 1726882903.55372: worker is 1 (out of 1 available) 33932 1726882903.55384: exiting _queue_task() for managed_node1/debug 33932 1726882903.55395: done queuing things up, now waiting for results queue to drain 33932 1726882903.55397: waiting for pending results... 33932 1726882903.55578: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 33932 1726882903.55661: in run() - task 0e448fcc-3ce9-615b-5c48-000000000066 33932 1726882903.55676: variable 'ansible_search_path' from source: unknown 33932 1726882903.55679: variable 'ansible_search_path' from source: unknown 33932 1726882903.55708: calling self._execute() 33932 1726882903.55787: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882903.55791: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882903.55798: variable 'omit' from source: magic vars 33932 1726882903.56072: variable 'ansible_distribution_major_version' from source: facts 33932 1726882903.56082: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882903.56088: variable 'omit' from source: magic vars 33932 1726882903.56123: variable 'omit' from source: magic vars 33932 1726882903.56195: variable 'network_provider' from source: set_fact 33932 1726882903.56207: variable 'omit' from source: magic vars 33932 1726882903.56238: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882903.56275: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882903.56293: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882903.56308: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882903.56317: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882903.56339: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882903.56344: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882903.56347: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882903.56421: Set connection var ansible_shell_executable to /bin/sh 33932 1726882903.56427: Set connection var ansible_timeout to 10 33932 1726882903.56432: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882903.56437: Set connection var ansible_pipelining to False 33932 1726882903.56440: Set connection var ansible_connection to ssh 33932 1726882903.56442: Set connection var ansible_shell_type to sh 33932 1726882903.56460: variable 'ansible_shell_executable' from source: unknown 33932 1726882903.56467: variable 'ansible_connection' from source: unknown 33932 1726882903.56473: variable 'ansible_module_compression' from source: unknown 33932 1726882903.56478: variable 'ansible_shell_type' from source: unknown 33932 1726882903.56481: variable 'ansible_shell_executable' from source: unknown 33932 1726882903.56483: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882903.56489: variable 'ansible_pipelining' from source: unknown 33932 1726882903.56494: variable 'ansible_timeout' from source: unknown 33932 1726882903.56498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882903.56594: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882903.56606: variable 'omit' from source: magic vars 33932 1726882903.56610: starting attempt loop 33932 1726882903.56613: running the handler 33932 1726882903.56651: handler run complete 33932 1726882903.56662: attempt loop complete, returning result 33932 1726882903.56670: _execute() done 33932 1726882903.56673: dumping result to json 33932 1726882903.56678: done dumping result, returning 33932 1726882903.56681: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-615b-5c48-000000000066] 33932 1726882903.56686: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000066 33932 1726882903.56757: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000066 33932 1726882903.56761: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 33932 1726882903.56960: no more pending results, returning what we have 33932 1726882903.56963: results queue empty 33932 1726882903.56966: checking for any_errors_fatal 33932 1726882903.56974: done checking for any_errors_fatal 33932 1726882903.56975: checking for max_fail_percentage 33932 1726882903.56977: done checking for max_fail_percentage 33932 1726882903.56978: checking to see if all hosts have failed and the running result is not ok 33932 1726882903.56979: done checking to see if all hosts have failed 33932 1726882903.56980: getting the remaining hosts for this loop 33932 1726882903.56982: done getting the remaining hosts for this loop 33932 1726882903.56985: getting the next task for host managed_node1 33932 1726882903.56992: done getting next task for host managed_node1 33932 1726882903.56996: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` 
variable with the initscripts provider 33932 1726882903.56999: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33932 1726882903.57017: getting variables 33932 1726882903.57019: in VariableManager get_vars() 33932 1726882903.57056: Calling all_inventory to load vars for managed_node1 33932 1726882903.57059: Calling groups_inventory to load vars for managed_node1 33932 1726882903.57062: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882903.57074: Calling all_plugins_play to load vars for managed_node1 33932 1726882903.57079: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882903.57082: Calling groups_plugins_play to load vars for managed_node1 33932 1726882903.58857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882903.60585: done with get_vars() 33932 1726882903.60608: done getting variables 33932 1726882903.60667: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:41:43 -0400 (0:00:00.055) 0:00:24.074 ****** 33932 1726882903.60702: entering _queue_task() for managed_node1/fail 33932 1726882903.60976: worker is 1 (out of 1 available) 33932 1726882903.60988: exiting _queue_task() for managed_node1/fail 33932 1726882903.60998: done queuing things up, now waiting for results queue to drain 33932 1726882903.60999: waiting for pending results... 33932 1726882903.61300: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 33932 1726882903.61446: in run() - task 0e448fcc-3ce9-615b-5c48-000000000067 33932 1726882903.61468: variable 'ansible_search_path' from source: unknown 33932 1726882903.61478: variable 'ansible_search_path' from source: unknown 33932 1726882903.61522: calling self._execute() 33932 1726882903.61630: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882903.61642: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882903.61661: variable 'omit' from source: magic vars 33932 1726882903.62067: variable 'ansible_distribution_major_version' from source: facts 33932 1726882903.62087: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882903.62218: variable 'network_state' from source: role '' defaults 33932 1726882903.62233: Evaluated conditional (network_state != {}): False 33932 1726882903.62241: when evaluation is False, skipping this task 33932 1726882903.62248: _execute() done 33932 1726882903.62256: dumping result to json 33932 1726882903.62265: done dumping result, returning 33932 1726882903.62276: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the 
`network_state` variable with the initscripts provider [0e448fcc-3ce9-615b-5c48-000000000067] 33932 1726882903.62289: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000067 33932 1726882903.62402: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000067 33932 1726882903.62411: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 33932 1726882903.62467: no more pending results, returning what we have 33932 1726882903.62471: results queue empty 33932 1726882903.62472: checking for any_errors_fatal 33932 1726882903.62480: done checking for any_errors_fatal 33932 1726882903.62481: checking for max_fail_percentage 33932 1726882903.62483: done checking for max_fail_percentage 33932 1726882903.62484: checking to see if all hosts have failed and the running result is not ok 33932 1726882903.62485: done checking to see if all hosts have failed 33932 1726882903.62485: getting the remaining hosts for this loop 33932 1726882903.62487: done getting the remaining hosts for this loop 33932 1726882903.62492: getting the next task for host managed_node1 33932 1726882903.62499: done getting next task for host managed_node1 33932 1726882903.62503: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 33932 1726882903.62507: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 33932 1726882903.62527: getting variables 33932 1726882903.62529: in VariableManager get_vars() 33932 1726882903.62571: Calling all_inventory to load vars for managed_node1 33932 1726882903.62574: Calling groups_inventory to load vars for managed_node1 33932 1726882903.62577: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882903.62590: Calling all_plugins_play to load vars for managed_node1 33932 1726882903.62593: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882903.62597: Calling groups_plugins_play to load vars for managed_node1 33932 1726882903.64374: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882903.66083: done with get_vars() 33932 1726882903.66108: done getting variables 33932 1726882903.66170: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:41:43 -0400 (0:00:00.054) 0:00:24.129 ****** 33932 1726882903.66203: entering _queue_task() for managed_node1/fail 33932 1726882903.66505: worker is 1 (out of 1 available) 33932 1726882903.66517: exiting _queue_task() for managed_node1/fail 33932 1726882903.66530: done queuing things up, now waiting for results queue to drain 33932 1726882903.66532: waiting for pending results... 
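
The `skipping: ... "false_condition": "network_state != {}"` results in this trace come from `fail` tasks gated by `when:` conditions. A hedged sketch of that guard pattern (assumptions, not the role's verbatim source — the task name and the `network_state != {}` condition appear in the log, the `msg` and the version comparison are illustrative):

```yaml
# Sketch of the fail-guard pattern behind the skips logged here: a `fail`
# task gated by `when:`. If every condition is true the play aborts with
# `msg`; if one is false, Ansible records it as `false_condition` and
# skips the task without executing the module.
- name: Abort applying the network state configuration if the system
    version of the managed host is below 8
  fail:
    msg: Applying the network state configuration requires EL8 or later  # assumed wording
  when:
    - network_state != {}
    - ansible_distribution_major_version | int < 8
```

On this run `network_state` is empty (role default), so the first condition is false and the task is skipped before any connection is made to the managed host.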
33932 1726882903.66814: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 33932 1726882903.66953: in run() - task 0e448fcc-3ce9-615b-5c48-000000000068 33932 1726882903.66974: variable 'ansible_search_path' from source: unknown 33932 1726882903.66981: variable 'ansible_search_path' from source: unknown 33932 1726882903.67022: calling self._execute() 33932 1726882903.67118: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882903.67127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882903.67141: variable 'omit' from source: magic vars 33932 1726882903.67545: variable 'ansible_distribution_major_version' from source: facts 33932 1726882903.67566: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882903.67692: variable 'network_state' from source: role '' defaults 33932 1726882903.67708: Evaluated conditional (network_state != {}): False 33932 1726882903.67716: when evaluation is False, skipping this task 33932 1726882903.67726: _execute() done 33932 1726882903.67737: dumping result to json 33932 1726882903.67745: done dumping result, returning 33932 1726882903.67756: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-615b-5c48-000000000068] 33932 1726882903.67768: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000068 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 33932 1726882903.67919: no more pending results, returning what we have 33932 1726882903.67923: results queue empty 33932 1726882903.67924: checking for any_errors_fatal 33932 1726882903.67932: done checking for any_errors_fatal 
33932 1726882903.67933: checking for max_fail_percentage 33932 1726882903.67935: done checking for max_fail_percentage 33932 1726882903.67936: checking to see if all hosts have failed and the running result is not ok 33932 1726882903.67937: done checking to see if all hosts have failed 33932 1726882903.67938: getting the remaining hosts for this loop 33932 1726882903.67940: done getting the remaining hosts for this loop 33932 1726882903.67944: getting the next task for host managed_node1 33932 1726882903.67950: done getting next task for host managed_node1 33932 1726882903.67955: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 33932 1726882903.67959: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882903.67982: getting variables 33932 1726882903.67985: in VariableManager get_vars() 33932 1726882903.68029: Calling all_inventory to load vars for managed_node1 33932 1726882903.68032: Calling groups_inventory to load vars for managed_node1 33932 1726882903.68035: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882903.68048: Calling all_plugins_play to load vars for managed_node1 33932 1726882903.68052: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882903.68055: Calling groups_plugins_play to load vars for managed_node1 33932 1726882903.69683: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000068 33932 1726882903.69687: WORKER PROCESS EXITING 33932 1726882903.69773: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882903.71553: done with get_vars() 33932 1726882903.71577: done getting variables 33932 1726882903.71636: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:41:43 -0400 (0:00:00.054) 0:00:24.184 ****** 33932 1726882903.71671: entering _queue_task() for managed_node1/fail 33932 1726882903.71951: worker is 1 (out of 1 available) 33932 1726882903.71964: exiting _queue_task() for managed_node1/fail 33932 1726882903.71976: done queuing things up, now waiting for results queue to drain 33932 1726882903.71978: waiting for pending results... 
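
The teaming guard queued above is later evaluated as `(ansible_distribution_major_version | int > 9): False` on this EL9 host. A hedged sketch of such a task (the `when` expression matches the log; the `msg` is an assumption):

```yaml
# Sketch of the EL10+ teaming guard. The `| int` cast is essential:
# Ansible facts such as ansible_distribution_major_version are strings,
# and the lexicographic comparison "9" > "10" would be true without it,
# while int("9") > 9 is correctly false.
- name: Abort applying teaming configuration if the system version of the
    managed host is EL10 or later
  fail:
    msg: Team interfaces are not supported on EL10 or later  # assumed wording
  when: ansible_distribution_major_version | int > 9
```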
33932 1726882903.72263: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 33932 1726882903.72414: in run() - task 0e448fcc-3ce9-615b-5c48-000000000069 33932 1726882903.72434: variable 'ansible_search_path' from source: unknown 33932 1726882903.72441: variable 'ansible_search_path' from source: unknown 33932 1726882903.72483: calling self._execute() 33932 1726882903.72584: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882903.72595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882903.72610: variable 'omit' from source: magic vars 33932 1726882903.72964: variable 'ansible_distribution_major_version' from source: facts 33932 1726882903.72983: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882903.73147: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 33932 1726882903.75549: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 33932 1726882903.75625: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 33932 1726882903.75668: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 33932 1726882903.75710: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 33932 1726882903.75738: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 33932 1726882903.75820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882903.75851: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882903.75887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882903.75935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882903.75954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882903.76067: variable 'ansible_distribution_major_version' from source: facts 33932 1726882903.76088: Evaluated conditional (ansible_distribution_major_version | int > 9): False 33932 1726882903.76099: when evaluation is False, skipping this task 33932 1726882903.76106: _execute() done 33932 1726882903.76113: dumping result to json 33932 1726882903.76119: done dumping result, returning 33932 1726882903.76129: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-615b-5c48-000000000069] 33932 1726882903.76138: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000069 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 33932 1726882903.76292: no more pending results, returning what we have 33932 1726882903.76296: results queue empty 33932 1726882903.76298: checking for any_errors_fatal 33932 1726882903.76305: done checking for any_errors_fatal 33932 
1726882903.76305: checking for max_fail_percentage 33932 1726882903.76308: done checking for max_fail_percentage 33932 1726882903.76309: checking to see if all hosts have failed and the running result is not ok 33932 1726882903.76310: done checking to see if all hosts have failed 33932 1726882903.76311: getting the remaining hosts for this loop 33932 1726882903.76313: done getting the remaining hosts for this loop 33932 1726882903.76317: getting the next task for host managed_node1 33932 1726882903.76325: done getting next task for host managed_node1 33932 1726882903.76329: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 33932 1726882903.76332: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882903.76352: getting variables 33932 1726882903.76354: in VariableManager get_vars() 33932 1726882903.76401: Calling all_inventory to load vars for managed_node1 33932 1726882903.76404: Calling groups_inventory to load vars for managed_node1 33932 1726882903.76407: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882903.76420: Calling all_plugins_play to load vars for managed_node1 33932 1726882903.76423: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882903.76426: Calling groups_plugins_play to load vars for managed_node1 33932 1726882903.77586: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000069 33932 1726882903.77590: WORKER PROCESS EXITING 33932 1726882903.78082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882903.79782: done with get_vars() 33932 1726882903.79811: done getting variables 33932 1726882903.79873: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:41:43 -0400 (0:00:00.082) 0:00:24.266 ****** 33932 1726882903.79909: entering _queue_task() for managed_node1/dnf 33932 1726882903.80224: worker is 1 (out of 1 available) 33932 1726882903.80238: exiting _queue_task() for managed_node1/dnf 33932 1726882903.80250: done queuing things up, now waiting for results queue to drain 33932 1726882903.80251: waiting for pending results... 
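
The task queued here uses the `dnf` action plugin, and the log later shows its distribution guard, `ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7`, evaluating to True on this EL9 host. A hedged sketch of what such a check might look like (the `when` expression is from the log; the package-list variable and `check_mode` usage are assumptions for illustration):

```yaml
# Sketch of a DNF availability check run in check mode: with
# check_mode true, `state: latest` reports whether updates exist
# without installing anything.
- name: Check if updates for network packages are available through the
    DNF package manager due to wireless or team interfaces
  dnf:
    name: "{{ network_packages }}"  # hypothetical variable name
    state: latest
  check_mode: true
  when: ansible_distribution == 'Fedora' or
        ansible_distribution_major_version | int > 7
```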
33932 1726882903.80542: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 33932 1726882903.80687: in run() - task 0e448fcc-3ce9-615b-5c48-00000000006a 33932 1726882903.80710: variable 'ansible_search_path' from source: unknown 33932 1726882903.80719: variable 'ansible_search_path' from source: unknown 33932 1726882903.80760: calling self._execute() 33932 1726882903.80862: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882903.80877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882903.80892: variable 'omit' from source: magic vars 33932 1726882903.81261: variable 'ansible_distribution_major_version' from source: facts 33932 1726882903.81281: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882903.81488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 33932 1726882903.84135: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 33932 1726882903.84206: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 33932 1726882903.84245: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 33932 1726882903.84288: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 33932 1726882903.84317: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 33932 1726882903.84399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882903.84429: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882903.84459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882903.84510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882903.84528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882903.84648: variable 'ansible_distribution' from source: facts 33932 1726882903.84657: variable 'ansible_distribution_major_version' from source: facts 33932 1726882903.84679: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 33932 1726882903.84797: variable '__network_wireless_connections_defined' from source: role '' defaults 33932 1726882903.84935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882903.84962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882903.84993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882903.85041: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882903.85059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882903.85104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882903.85131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882903.85162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882903.85208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882903.85227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882903.85273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882903.85301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 
1726882903.85331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882903.85380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882903.85397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882903.85553: variable 'network_connections' from source: task vars 33932 1726882903.85572: variable 'interface' from source: play vars 33932 1726882903.85635: variable 'interface' from source: play vars 33932 1726882903.85652: variable 'vlan_interface' from source: play vars 33932 1726882903.85720: variable 'vlan_interface' from source: play vars 33932 1726882903.85796: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 33932 1726882903.85962: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 33932 1726882903.86007: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 33932 1726882903.86038: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 33932 1726882903.86074: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 33932 1726882903.86129: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 33932 1726882903.86156: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 33932 1726882903.86197: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882903.86233: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 33932 1726882903.86290: variable '__network_team_connections_defined' from source: role '' defaults 33932 1726882903.86530: variable 'network_connections' from source: task vars 33932 1726882903.86541: variable 'interface' from source: play vars 33932 1726882903.86609: variable 'interface' from source: play vars 33932 1726882903.86622: variable 'vlan_interface' from source: play vars 33932 1726882903.86691: variable 'vlan_interface' from source: play vars 33932 1726882903.86720: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 33932 1726882903.86728: when evaluation is False, skipping this task 33932 1726882903.86735: _execute() done 33932 1726882903.86742: dumping result to json 33932 1726882903.86749: done dumping result, returning 33932 1726882903.86760: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-615b-5c48-00000000006a] 33932 1726882903.86777: sending task result for task 0e448fcc-3ce9-615b-5c48-00000000006a skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 33932 1726882903.86932: no more pending results, returning what we have 33932 1726882903.86936: 
results queue empty 33932 1726882903.86937: checking for any_errors_fatal 33932 1726882903.86944: done checking for any_errors_fatal 33932 1726882903.86945: checking for max_fail_percentage 33932 1726882903.86947: done checking for max_fail_percentage 33932 1726882903.86948: checking to see if all hosts have failed and the running result is not ok 33932 1726882903.86948: done checking to see if all hosts have failed 33932 1726882903.86949: getting the remaining hosts for this loop 33932 1726882903.86951: done getting the remaining hosts for this loop 33932 1726882903.86955: getting the next task for host managed_node1 33932 1726882903.86962: done getting next task for host managed_node1 33932 1726882903.86968: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 33932 1726882903.86971: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882903.86991: getting variables 33932 1726882903.86993: in VariableManager get_vars() 33932 1726882903.87039: Calling all_inventory to load vars for managed_node1 33932 1726882903.87042: Calling groups_inventory to load vars for managed_node1 33932 1726882903.87045: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882903.87056: Calling all_plugins_play to load vars for managed_node1 33932 1726882903.87059: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882903.87063: Calling groups_plugins_play to load vars for managed_node1 33932 1726882903.88082: done sending task result for task 0e448fcc-3ce9-615b-5c48-00000000006a 33932 1726882903.88085: WORKER PROCESS EXITING 33932 1726882903.88909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882903.90603: done with get_vars() 33932 1726882903.90625: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 33932 1726882903.90700: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:41:43 -0400 (0:00:00.108) 0:00:24.374 ****** 33932 1726882903.90731: entering _queue_task() for managed_node1/yum 33932 1726882903.91016: worker is 1 (out of 1 available) 33932 1726882903.91027: exiting _queue_task() for managed_node1/yum 33932 1726882903.91037: done queuing things up, now 
waiting for results queue to drain 33932 1726882903.91039: waiting for pending results... 33932 1726882903.91309: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 33932 1726882903.91448: in run() - task 0e448fcc-3ce9-615b-5c48-00000000006b 33932 1726882903.91472: variable 'ansible_search_path' from source: unknown 33932 1726882903.91488: variable 'ansible_search_path' from source: unknown 33932 1726882903.91528: calling self._execute() 33932 1726882903.91632: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882903.91644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882903.91657: variable 'omit' from source: magic vars 33932 1726882903.92036: variable 'ansible_distribution_major_version' from source: facts 33932 1726882903.92053: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882903.92232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 33932 1726882903.94653: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 33932 1726882903.94726: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 33932 1726882903.94771: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 33932 1726882903.94810: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 33932 1726882903.94844: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 33932 1726882903.94925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882903.94965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882903.94997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882903.95041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882903.95065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882903.95165: variable 'ansible_distribution_major_version' from source: facts 33932 1726882903.95188: Evaluated conditional (ansible_distribution_major_version | int < 8): False 33932 1726882903.95195: when evaluation is False, skipping this task 33932 1726882903.95201: _execute() done 33932 1726882903.95208: dumping result to json 33932 1726882903.95215: done dumping result, returning 33932 1726882903.95225: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-615b-5c48-00000000006b] 33932 1726882903.95233: sending task result for task 0e448fcc-3ce9-615b-5c48-00000000006b skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 33932 1726882903.95383: no more pending results, returning 
what we have 33932 1726882903.95387: results queue empty 33932 1726882903.95388: checking for any_errors_fatal 33932 1726882903.95395: done checking for any_errors_fatal 33932 1726882903.95395: checking for max_fail_percentage 33932 1726882903.95397: done checking for max_fail_percentage 33932 1726882903.95398: checking to see if all hosts have failed and the running result is not ok 33932 1726882903.95399: done checking to see if all hosts have failed 33932 1726882903.95400: getting the remaining hosts for this loop 33932 1726882903.95402: done getting the remaining hosts for this loop 33932 1726882903.95406: getting the next task for host managed_node1 33932 1726882903.95412: done getting next task for host managed_node1 33932 1726882903.95416: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 33932 1726882903.95419: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882903.95436: getting variables 33932 1726882903.95438: in VariableManager get_vars() 33932 1726882903.95480: Calling all_inventory to load vars for managed_node1 33932 1726882903.95483: Calling groups_inventory to load vars for managed_node1 33932 1726882903.95485: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882903.95496: Calling all_plugins_play to load vars for managed_node1 33932 1726882903.95499: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882903.95501: Calling groups_plugins_play to load vars for managed_node1 33932 1726882903.96483: done sending task result for task 0e448fcc-3ce9-615b-5c48-00000000006b 33932 1726882903.96486: WORKER PROCESS EXITING 33932 1726882903.97225: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882903.99033: done with get_vars() 33932 1726882903.99057: done getting variables 33932 1726882903.99122: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:41:43 -0400 (0:00:00.084) 0:00:24.459 ****** 33932 1726882903.99158: entering _queue_task() for managed_node1/fail 33932 1726882903.99502: worker is 1 (out of 1 available) 33932 1726882903.99516: exiting _queue_task() for managed_node1/fail 33932 1726882903.99528: done queuing things up, now waiting for results queue to drain 33932 1726882903.99529: waiting for pending results... 
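
[annotation] The skip just recorded is the companion YUM task: on Fedora / EL8+ hosts the guard `ansible_distribution_major_version | int < 8` evaluates to False, and even where it would run, the trace shows ansible-core 2.17 redirecting `ansible.builtin.yum` to the dnf action ("redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf"). A minimal version-guarded variant might look like this (sketch only, assumed parameters):

```yaml
# Sketch: the EL7-and-older path of the DNF/YUM task pair.
# The guard is the exact condition reported as false_condition above.
- name: Check if updates for network packages are available through the YUM package manager
  ansible.builtin.yum:
    name: "{{ network_packages }}"
    state: latest
  check_mode: true
  when: ansible_distribution_major_version | int < 8
```
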
33932 1726882903.99821: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 33932 1726882903.99969: in run() - task 0e448fcc-3ce9-615b-5c48-00000000006c 33932 1726882903.99991: variable 'ansible_search_path' from source: unknown 33932 1726882903.99999: variable 'ansible_search_path' from source: unknown 33932 1726882904.00041: calling self._execute() 33932 1726882904.00143: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882904.00155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882904.00172: variable 'omit' from source: magic vars 33932 1726882904.00551: variable 'ansible_distribution_major_version' from source: facts 33932 1726882904.00573: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882904.00695: variable '__network_wireless_connections_defined' from source: role '' defaults 33932 1726882904.00897: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 33932 1726882904.03247: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 33932 1726882904.03320: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 33932 1726882904.03362: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 33932 1726882904.03404: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 33932 1726882904.03434: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 33932 1726882904.03520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 33932 1726882904.03555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882904.03589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882904.03634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882904.03651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882904.03703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882904.03730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882904.03760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882904.03807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882904.03826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882904.03871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882904.03903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882904.03930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882904.03974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882904.03995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882904.04170: variable 'network_connections' from source: task vars 33932 1726882904.04186: variable 'interface' from source: play vars 33932 1726882904.04257: variable 'interface' from source: play vars 33932 1726882904.04275: variable 'vlan_interface' from source: play vars 33932 1726882904.04343: variable 'vlan_interface' from source: play vars 33932 1726882904.04417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 33932 1726882904.04599: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 33932 1726882904.04643: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 33932 1726882904.04680: 
Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 33932 1726882904.04712: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 33932 1726882904.04761: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 33932 1726882904.04789: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 33932 1726882904.04817: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882904.04845: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 33932 1726882904.04903: variable '__network_team_connections_defined' from source: role '' defaults 33932 1726882904.05145: variable 'network_connections' from source: task vars 33932 1726882904.05155: variable 'interface' from source: play vars 33932 1726882904.05224: variable 'interface' from source: play vars 33932 1726882904.05236: variable 'vlan_interface' from source: play vars 33932 1726882904.05302: variable 'vlan_interface' from source: play vars 33932 1726882904.05329: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 33932 1726882904.05336: when evaluation is False, skipping this task 33932 1726882904.05343: _execute() done 33932 1726882904.05349: dumping result to json 33932 1726882904.05356: done dumping result, returning 33932 1726882904.05368: done running TaskExecutor() for managed_node1/TASK: 
fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-615b-5c48-00000000006c] 33932 1726882904.05385: sending task result for task 0e448fcc-3ce9-615b-5c48-00000000006c skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 33932 1726882904.05552: no more pending results, returning what we have 33932 1726882904.05557: results queue empty 33932 1726882904.05558: checking for any_errors_fatal 33932 1726882904.05567: done checking for any_errors_fatal 33932 1726882904.05567: checking for max_fail_percentage 33932 1726882904.05570: done checking for max_fail_percentage 33932 1726882904.05571: checking to see if all hosts have failed and the running result is not ok 33932 1726882904.05572: done checking to see if all hosts have failed 33932 1726882904.05572: getting the remaining hosts for this loop 33932 1726882904.05574: done getting the remaining hosts for this loop 33932 1726882904.05579: getting the next task for host managed_node1 33932 1726882904.05585: done getting next task for host managed_node1 33932 1726882904.05590: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 33932 1726882904.05593: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882904.05610: getting variables 33932 1726882904.05613: in VariableManager get_vars() 33932 1726882904.05655: Calling all_inventory to load vars for managed_node1 33932 1726882904.05657: Calling groups_inventory to load vars for managed_node1 33932 1726882904.05660: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882904.05673: Calling all_plugins_play to load vars for managed_node1 33932 1726882904.05677: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882904.05681: Calling groups_plugins_play to load vars for managed_node1 33932 1726882904.06680: done sending task result for task 0e448fcc-3ce9-615b-5c48-00000000006c 33932 1726882904.06684: WORKER PROCESS EXITING 33932 1726882904.07422: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882904.09090: done with get_vars() 33932 1726882904.09116: done getting variables 33932 1726882904.09177: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:41:44 -0400 (0:00:00.100) 0:00:24.559 ****** 33932 1726882904.09213: entering _queue_task() for managed_node1/package 33932 1726882904.09510: worker is 1 (out of 1 available) 33932 1726882904.09522: exiting _queue_task() for managed_node1/package 33932 1726882904.09533: done queuing things up, now waiting for results queue to drain 33932 1726882904.09534: waiting for pending results... 
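
[annotation] The variable trace that follows shows the role resolving `network_packages` from its defaults through the provider-specific `__network_packages_default_nm` chain before queuing the generic `package` action at tasks/main.yml:73. The indirection is roughly of this form — a sketch under assumed values; the role's real defaults/main.yml differs in detail:

```yaml
# Sketch of the defaults indirection visible in the variable trace
# (defaults/main.yml): the NetworkManager provider selects the list...
__network_packages_default_nm:
  - NetworkManager
network_packages: "{{ __network_packages_default_nm }}"
```

```yaml
# ...and the generic install task (tasks/main.yml:73) consumes it,
# delegating to the platform package manager via ansible.builtin.package.
- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
```
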
33932 1726882904.09804: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 33932 1726882904.09950: in run() - task 0e448fcc-3ce9-615b-5c48-00000000006d 33932 1726882904.09973: variable 'ansible_search_path' from source: unknown 33932 1726882904.09980: variable 'ansible_search_path' from source: unknown 33932 1726882904.10019: calling self._execute() 33932 1726882904.10117: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882904.10127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882904.10140: variable 'omit' from source: magic vars 33932 1726882904.10501: variable 'ansible_distribution_major_version' from source: facts 33932 1726882904.10522: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882904.10711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 33932 1726882904.10977: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 33932 1726882904.11026: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 33932 1726882904.11069: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 33932 1726882904.11108: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 33932 1726882904.11216: variable 'network_packages' from source: role '' defaults 33932 1726882904.11325: variable '__network_provider_setup' from source: role '' defaults 33932 1726882904.11340: variable '__network_service_name_default_nm' from source: role '' defaults 33932 1726882904.11409: variable '__network_service_name_default_nm' from source: role '' defaults 33932 1726882904.11421: variable '__network_packages_default_nm' from source: role '' defaults 33932 1726882904.11516: variable 
'__network_packages_default_nm' from source: role '' defaults 33932 1726882904.11707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 33932 1726882904.13849: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 33932 1726882904.13931: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 33932 1726882904.13976: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 33932 1726882904.14015: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 33932 1726882904.14045: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 33932 1726882904.14132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882904.14166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882904.14201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882904.14247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882904.14270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 
1726882904.14323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882904.14350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882904.14381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882904.14430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882904.14449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882904.14658: variable '__network_packages_default_gobject_packages' from source: role '' defaults 33932 1726882904.14775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882904.14801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882904.14825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882904.14875: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882904.14893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882904.14987: variable 'ansible_python' from source: facts 33932 1726882904.15016: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 33932 1726882904.15103: variable '__network_wpa_supplicant_required' from source: role '' defaults 33932 1726882904.15190: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 33932 1726882904.15319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882904.15346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882904.15378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882904.15425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882904.15442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882904.15495: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882904.15532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882904.15559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882904.15604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882904.15627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882904.15775: variable 'network_connections' from source: task vars 33932 1726882904.15786: variable 'interface' from source: play vars 33932 1726882904.15891: variable 'interface' from source: play vars 33932 1726882904.15906: variable 'vlan_interface' from source: play vars 33932 1726882904.16010: variable 'vlan_interface' from source: play vars 33932 1726882904.16084: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 33932 1726882904.16114: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 33932 1726882904.16145: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882904.16186: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 33932 1726882904.16236: variable '__network_wireless_connections_defined' from source: role '' defaults 33932 1726882904.16523: variable 'network_connections' from source: task vars 33932 1726882904.16532: variable 'interface' from source: play vars 33932 1726882904.16635: variable 'interface' from source: play vars 33932 1726882904.16649: variable 'vlan_interface' from source: play vars 33932 1726882904.16751: variable 'vlan_interface' from source: play vars 33932 1726882904.16787: variable '__network_packages_default_wireless' from source: role '' defaults 33932 1726882904.16872: variable '__network_wireless_connections_defined' from source: role '' defaults 33932 1726882904.17185: variable 'network_connections' from source: task vars 33932 1726882904.17194: variable 'interface' from source: play vars 33932 1726882904.17262: variable 'interface' from source: play vars 33932 1726882904.17276: variable 'vlan_interface' from source: play vars 33932 1726882904.17341: variable 'vlan_interface' from source: play vars 33932 1726882904.17370: variable '__network_packages_default_team' from source: role '' defaults 33932 1726882904.17451: variable '__network_team_connections_defined' from source: role '' defaults 33932 1726882904.18118: variable 'network_connections' from source: task vars 33932 1726882904.18128: variable 'interface' from source: play vars 33932 1726882904.18196: variable 'interface' from source: play vars 33932 1726882904.18210: variable 'vlan_interface' from source: play vars 33932 1726882904.18280: variable 'vlan_interface' from source: play vars 33932 1726882904.18344: variable '__network_service_name_default_initscripts' from 
source: role '' defaults 33932 1726882904.18408: variable '__network_service_name_default_initscripts' from source: role '' defaults 33932 1726882904.18420: variable '__network_packages_default_initscripts' from source: role '' defaults 33932 1726882904.18488: variable '__network_packages_default_initscripts' from source: role '' defaults 33932 1726882904.18714: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 33932 1726882904.19210: variable 'network_connections' from source: task vars 33932 1726882904.19220: variable 'interface' from source: play vars 33932 1726882904.19283: variable 'interface' from source: play vars 33932 1726882904.19300: variable 'vlan_interface' from source: play vars 33932 1726882904.19362: variable 'vlan_interface' from source: play vars 33932 1726882904.19377: variable 'ansible_distribution' from source: facts 33932 1726882904.19386: variable '__network_rh_distros' from source: role '' defaults 33932 1726882904.19394: variable 'ansible_distribution_major_version' from source: facts 33932 1726882904.19416: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 33932 1726882904.19587: variable 'ansible_distribution' from source: facts 33932 1726882904.19596: variable '__network_rh_distros' from source: role '' defaults 33932 1726882904.19605: variable 'ansible_distribution_major_version' from source: facts 33932 1726882904.19626: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 33932 1726882904.19797: variable 'ansible_distribution' from source: facts 33932 1726882904.19805: variable '__network_rh_distros' from source: role '' defaults 33932 1726882904.19814: variable 'ansible_distribution_major_version' from source: facts 33932 1726882904.19857: variable 'network_provider' from source: set_fact 33932 1726882904.19879: variable 'ansible_facts' from source: unknown 33932 1726882904.25467: Evaluated conditional (not 
network_packages is subset(ansible_facts.packages.keys())): False 33932 1726882904.25479: when evaluation is False, skipping this task 33932 1726882904.25486: _execute() done 33932 1726882904.25497: dumping result to json 33932 1726882904.25505: done dumping result, returning 33932 1726882904.25515: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-615b-5c48-00000000006d] 33932 1726882904.25524: sending task result for task 0e448fcc-3ce9-615b-5c48-00000000006d skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 33932 1726882904.25671: no more pending results, returning what we have 33932 1726882904.25675: results queue empty 33932 1726882904.25676: checking for any_errors_fatal 33932 1726882904.25682: done checking for any_errors_fatal 33932 1726882904.25683: checking for max_fail_percentage 33932 1726882904.25685: done checking for max_fail_percentage 33932 1726882904.25686: checking to see if all hosts have failed and the running result is not ok 33932 1726882904.25687: done checking to see if all hosts have failed 33932 1726882904.25688: getting the remaining hosts for this loop 33932 1726882904.25689: done getting the remaining hosts for this loop 33932 1726882904.25693: getting the next task for host managed_node1 33932 1726882904.25698: done getting next task for host managed_node1 33932 1726882904.25703: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 33932 1726882904.25706: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33932 1726882904.25724: getting variables 33932 1726882904.25725: in VariableManager get_vars() 33932 1726882904.25772: Calling all_inventory to load vars for managed_node1 33932 1726882904.25775: Calling groups_inventory to load vars for managed_node1 33932 1726882904.25777: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882904.25787: Calling all_plugins_play to load vars for managed_node1 33932 1726882904.25790: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882904.25793: Calling groups_plugins_play to load vars for managed_node1 33932 1726882904.26783: done sending task result for task 0e448fcc-3ce9-615b-5c48-00000000006d 33932 1726882904.26787: WORKER PROCESS EXITING 33932 1726882904.27706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882904.33785: done with get_vars() 33932 1726882904.33810: done getting variables 33932 1726882904.33859: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:41:44 -0400 (0:00:00.246) 0:00:24.806 ****** 33932 1726882904.33890: 
entering _queue_task() for managed_node1/package 33932 1726882904.34201: worker is 1 (out of 1 available) 33932 1726882904.34215: exiting _queue_task() for managed_node1/package 33932 1726882904.34227: done queuing things up, now waiting for results queue to drain 33932 1726882904.34229: waiting for pending results... 33932 1726882904.34408: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 33932 1726882904.34499: in run() - task 0e448fcc-3ce9-615b-5c48-00000000006e 33932 1726882904.34510: variable 'ansible_search_path' from source: unknown 33932 1726882904.34515: variable 'ansible_search_path' from source: unknown 33932 1726882904.34543: calling self._execute() 33932 1726882904.34617: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882904.34624: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882904.34631: variable 'omit' from source: magic vars 33932 1726882904.34912: variable 'ansible_distribution_major_version' from source: facts 33932 1726882904.34923: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882904.35007: variable 'network_state' from source: role '' defaults 33932 1726882904.35016: Evaluated conditional (network_state != {}): False 33932 1726882904.35020: when evaluation is False, skipping this task 33932 1726882904.35023: _execute() done 33932 1726882904.35026: dumping result to json 33932 1726882904.35029: done dumping result, returning 33932 1726882904.35034: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-615b-5c48-00000000006e] 33932 1726882904.35040: sending task result for task 0e448fcc-3ce9-615b-5c48-00000000006e 33932 1726882904.35133: done sending task result for task 0e448fcc-3ce9-615b-5c48-00000000006e 33932 
1726882904.35137: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 33932 1726882904.35189: no more pending results, returning what we have 33932 1726882904.35193: results queue empty 33932 1726882904.35194: checking for any_errors_fatal 33932 1726882904.35204: done checking for any_errors_fatal 33932 1726882904.35205: checking for max_fail_percentage 33932 1726882904.35206: done checking for max_fail_percentage 33932 1726882904.35207: checking to see if all hosts have failed and the running result is not ok 33932 1726882904.35208: done checking to see if all hosts have failed 33932 1726882904.35209: getting the remaining hosts for this loop 33932 1726882904.35210: done getting the remaining hosts for this loop 33932 1726882904.35214: getting the next task for host managed_node1 33932 1726882904.35219: done getting next task for host managed_node1 33932 1726882904.35223: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 33932 1726882904.35226: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882904.35241: getting variables 33932 1726882904.35242: in VariableManager get_vars() 33932 1726882904.35283: Calling all_inventory to load vars for managed_node1 33932 1726882904.35286: Calling groups_inventory to load vars for managed_node1 33932 1726882904.35288: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882904.35297: Calling all_plugins_play to load vars for managed_node1 33932 1726882904.35299: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882904.35301: Calling groups_plugins_play to load vars for managed_node1 33932 1726882904.36325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882904.37671: done with get_vars() 33932 1726882904.37686: done getting variables 33932 1726882904.37726: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:41:44 -0400 (0:00:00.038) 0:00:24.845 ****** 33932 1726882904.37749: entering _queue_task() for managed_node1/package 33932 1726882904.37956: worker is 1 (out of 1 available) 33932 1726882904.37974: exiting _queue_task() for managed_node1/package 33932 1726882904.37988: done queuing things up, now waiting for results queue to drain 33932 1726882904.37990: waiting for pending results... 
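The "Install packages" skip earlier in this trace reports `false_condition: "not network_packages is subset(ansible_facts.packages.keys())"`. Ansible's built-in `subset` test is plain set containment, so the task only runs when some required package is missing from the gathered package facts. A minimal sketch of that decision, with illustrative package names (not values taken from this run):

```python
# Mirrors the Jinja2 condition that skipped "Install packages":
#   not network_packages is subset(ansible_facts.packages.keys())
# Package names below are illustrative, not from this log.
network_packages = ["NetworkManager"]
# ansible_facts.packages maps package name -> list of version dicts
installed = {"NetworkManager": [{"version": "1.0"}], "kernel": [{"version": "6.0"}]}

# `a is subset(b)` in Jinja2 is set(a) <= set(b); here every required
# package is already installed, so the condition is False.
needs_install = not set(network_packages).issubset(installed.keys())
print(needs_install)  # False -> the task is skipped
```

Because the package facts were already gathered, the role can skip the `package` module entirely on hosts that need nothing, which is why the trace shows a skip rather than a module invocation.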
33932 1726882904.38150: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 33932 1726882904.38235: in run() - task 0e448fcc-3ce9-615b-5c48-00000000006f 33932 1726882904.38246: variable 'ansible_search_path' from source: unknown 33932 1726882904.38250: variable 'ansible_search_path' from source: unknown 33932 1726882904.38282: calling self._execute() 33932 1726882904.38355: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882904.38358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882904.38370: variable 'omit' from source: magic vars 33932 1726882904.38636: variable 'ansible_distribution_major_version' from source: facts 33932 1726882904.38646: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882904.38732: variable 'network_state' from source: role '' defaults 33932 1726882904.38740: Evaluated conditional (network_state != {}): False 33932 1726882904.38743: when evaluation is False, skipping this task 33932 1726882904.38747: _execute() done 33932 1726882904.38750: dumping result to json 33932 1726882904.38752: done dumping result, returning 33932 1726882904.38760: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-615b-5c48-00000000006f] 33932 1726882904.38765: sending task result for task 0e448fcc-3ce9-615b-5c48-00000000006f 33932 1726882904.38853: done sending task result for task 0e448fcc-3ce9-615b-5c48-00000000006f 33932 1726882904.38855: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 33932 1726882904.38908: no more pending results, returning what we have 33932 1726882904.38911: results queue empty 33932 1726882904.38912: checking for 
any_errors_fatal 33932 1726882904.38917: done checking for any_errors_fatal 33932 1726882904.38918: checking for max_fail_percentage 33932 1726882904.38920: done checking for max_fail_percentage 33932 1726882904.38920: checking to see if all hosts have failed and the running result is not ok 33932 1726882904.38921: done checking to see if all hosts have failed 33932 1726882904.38922: getting the remaining hosts for this loop 33932 1726882904.38923: done getting the remaining hosts for this loop 33932 1726882904.38926: getting the next task for host managed_node1 33932 1726882904.38931: done getting next task for host managed_node1 33932 1726882904.38934: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 33932 1726882904.38937: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882904.38951: getting variables 33932 1726882904.38953: in VariableManager get_vars() 33932 1726882904.39009: Calling all_inventory to load vars for managed_node1 33932 1726882904.39012: Calling groups_inventory to load vars for managed_node1 33932 1726882904.39014: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882904.39020: Calling all_plugins_play to load vars for managed_node1 33932 1726882904.39022: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882904.39024: Calling groups_plugins_play to load vars for managed_node1 33932 1726882904.40589: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882904.41674: done with get_vars() 33932 1726882904.41691: done getting variables 33932 1726882904.41730: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:41:44 -0400 (0:00:00.040) 0:00:24.885 ****** 33932 1726882904.41753: entering _queue_task() for managed_node1/service 33932 1726882904.41952: worker is 1 (out of 1 available) 33932 1726882904.41967: exiting _queue_task() for managed_node1/service 33932 1726882904.41982: done queuing things up, now waiting for results queue to drain 33932 1726882904.41985: waiting for pending results... 
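The remaining skips in this trace all come from two role-default gates: `network_state` (an empty dict unless the caller sets it) and the derived wireless/team flags. Re-evaluating them with the values the log implies (variable names from the log; values are the role defaults this run reports):

```python
# Skip decisions reported in this trace, re-evaluated with the
# role-default values the log implies:
network_state = {}          # role '' defaults per the log
wireless_defined = False    # __network_wireless_connections_defined
team_defined = False        # __network_team_connections_defined

# "Install NetworkManager and nmstate ..." and
# "Install python3-libnmstate ..." are both gated on network_state:
nmstate_tasks_run = network_state != {}
print(nmstate_tasks_run)  # False -> both tasks skipped

# "Restart NetworkManager due to wireless or team interfaces":
restart_runs = wireless_defined or team_defined
print(restart_runs)  # False -> skipped
```

This matches the trace: both nmstate install tasks and the NetworkManager restart report `skip_reason: "Conditional result was False"` without dispatching any module to the host.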
33932 1726882904.42148: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 33932 1726882904.42232: in run() - task 0e448fcc-3ce9-615b-5c48-000000000070 33932 1726882904.42245: variable 'ansible_search_path' from source: unknown 33932 1726882904.42249: variable 'ansible_search_path' from source: unknown 33932 1726882904.42279: calling self._execute() 33932 1726882904.42356: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882904.42360: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882904.42372: variable 'omit' from source: magic vars 33932 1726882904.42635: variable 'ansible_distribution_major_version' from source: facts 33932 1726882904.42646: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882904.42754: variable '__network_wireless_connections_defined' from source: role '' defaults 33932 1726882904.42926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 33932 1726882904.45083: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 33932 1726882904.45152: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 33932 1726882904.45191: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 33932 1726882904.45224: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 33932 1726882904.45249: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 33932 1726882904.45800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 33932 1726882904.45885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882904.45917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882904.45977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882904.46008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882904.46073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882904.46104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882904.46134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882904.46195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882904.46215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882904.46265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882904.46298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882904.46329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882904.46386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882904.46407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882904.46607: variable 'network_connections' from source: task vars 33932 1726882904.46624: variable 'interface' from source: play vars 33932 1726882904.46711: variable 'interface' from source: play vars 33932 1726882904.46727: variable 'vlan_interface' from source: play vars 33932 1726882904.46795: variable 'vlan_interface' from source: play vars 33932 1726882904.46884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 33932 1726882904.47093: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 33932 1726882904.47144: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 33932 1726882904.47184: 
Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 33932 1726882904.47218: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 33932 1726882904.47279: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 33932 1726882904.47307: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 33932 1726882904.47337: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882904.47381: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 33932 1726882904.47436: variable '__network_team_connections_defined' from source: role '' defaults 33932 1726882904.47976: variable 'network_connections' from source: task vars 33932 1726882904.47995: variable 'interface' from source: play vars 33932 1726882904.48059: variable 'interface' from source: play vars 33932 1726882904.48075: variable 'vlan_interface' from source: play vars 33932 1726882904.48148: variable 'vlan_interface' from source: play vars 33932 1726882904.48182: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 33932 1726882904.48190: when evaluation is False, skipping this task 33932 1726882904.48205: _execute() done 33932 1726882904.48213: dumping result to json 33932 1726882904.48220: done dumping result, returning 33932 1726882904.48230: done running TaskExecutor() for managed_node1/TASK: 
fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-615b-5c48-000000000070] 33932 1726882904.48248: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000070 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 33932 1726882904.48398: no more pending results, returning what we have 33932 1726882904.48401: results queue empty 33932 1726882904.48403: checking for any_errors_fatal 33932 1726882904.48410: done checking for any_errors_fatal 33932 1726882904.48411: checking for max_fail_percentage 33932 1726882904.48413: done checking for max_fail_percentage 33932 1726882904.48414: checking to see if all hosts have failed and the running result is not ok 33932 1726882904.48415: done checking to see if all hosts have failed 33932 1726882904.48415: getting the remaining hosts for this loop 33932 1726882904.48417: done getting the remaining hosts for this loop 33932 1726882904.48421: getting the next task for host managed_node1 33932 1726882904.48427: done getting next task for host managed_node1 33932 1726882904.48431: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 33932 1726882904.48434: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882904.48452: getting variables 33932 1726882904.48454: in VariableManager get_vars() 33932 1726882904.48498: Calling all_inventory to load vars for managed_node1 33932 1726882904.48501: Calling groups_inventory to load vars for managed_node1 33932 1726882904.48503: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882904.48514: Calling all_plugins_play to load vars for managed_node1 33932 1726882904.48517: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882904.48520: Calling groups_plugins_play to load vars for managed_node1 33932 1726882904.49450: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000070 33932 1726882904.49453: WORKER PROCESS EXITING 33932 1726882904.49785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882904.50740: done with get_vars() 33932 1726882904.50756: done getting variables 33932 1726882904.50801: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:41:44 -0400 (0:00:00.090) 0:00:24.975 ****** 33932 1726882904.50825: entering _queue_task() for managed_node1/service 33932 1726882904.51041: worker is 1 (out of 1 available) 33932 1726882904.51055: exiting _queue_task() for managed_node1/service 33932 1726882904.51068: done queuing things up, now waiting for results queue to drain 33932 1726882904.51070: waiting for pending results... 
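[Editor's note] The `skipping: [managed_node1]` result above is produced by a `when:` guard on the role task. A minimal sketch of what such a guarded task looks like, reconstructed only from the task name and the `false_condition` shown in this log (the actual task in fedora.linux_system_roles.network may differ in module and arguments):

```yaml
# Hypothetical reconstruction based on the log above, not the role source.
- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

When both variables evaluate to False, the task executor short-circuits before the module runs, which is why the result carries `"changed": false` and `"skip_reason": "Conditional result was False"`.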
33932 1726882904.51248: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 33932 1726882904.51342: in run() - task 0e448fcc-3ce9-615b-5c48-000000000071 33932 1726882904.51354: variable 'ansible_search_path' from source: unknown 33932 1726882904.51358: variable 'ansible_search_path' from source: unknown 33932 1726882904.51392: calling self._execute() 33932 1726882904.51468: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882904.51474: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882904.51484: variable 'omit' from source: magic vars 33932 1726882904.51758: variable 'ansible_distribution_major_version' from source: facts 33932 1726882904.51769: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882904.51882: variable 'network_provider' from source: set_fact 33932 1726882904.51886: variable 'network_state' from source: role '' defaults 33932 1726882904.51894: Evaluated conditional (network_provider == "nm" or network_state != {}): True 33932 1726882904.51900: variable 'omit' from source: magic vars 33932 1726882904.51940: variable 'omit' from source: magic vars 33932 1726882904.51959: variable 'network_service_name' from source: role '' defaults 33932 1726882904.52008: variable 'network_service_name' from source: role '' defaults 33932 1726882904.52085: variable '__network_provider_setup' from source: role '' defaults 33932 1726882904.52090: variable '__network_service_name_default_nm' from source: role '' defaults 33932 1726882904.52134: variable '__network_service_name_default_nm' from source: role '' defaults 33932 1726882904.52144: variable '__network_packages_default_nm' from source: role '' defaults 33932 1726882904.52189: variable '__network_packages_default_nm' from source: role '' defaults 33932 1726882904.52332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 33932 1726882904.54047: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 33932 1726882904.54100: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 33932 1726882904.54127: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 33932 1726882904.54152: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 33932 1726882904.54174: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 33932 1726882904.54231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882904.54250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882904.54271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882904.54297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882904.54310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882904.54341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 33932 1726882904.54357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882904.54376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882904.54401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882904.54414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882904.54552: variable '__network_packages_default_gobject_packages' from source: role '' defaults 33932 1726882904.54624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882904.54643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882904.54661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882904.54690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882904.54700: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882904.54760: variable 'ansible_python' from source: facts 33932 1726882904.54778: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 33932 1726882904.54832: variable '__network_wpa_supplicant_required' from source: role '' defaults 33932 1726882904.54888: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 33932 1726882904.54972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882904.54988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882904.55005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882904.55029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882904.55039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882904.55078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882904.55095: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882904.55112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882904.55136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882904.55146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882904.55238: variable 'network_connections' from source: task vars 33932 1726882904.55245: variable 'interface' from source: play vars 33932 1726882904.55300: variable 'interface' from source: play vars 33932 1726882904.55310: variable 'vlan_interface' from source: play vars 33932 1726882904.55359: variable 'vlan_interface' from source: play vars 33932 1726882904.55430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 33932 1726882904.55558: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 33932 1726882904.55593: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 33932 1726882904.55625: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 33932 1726882904.55654: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 33932 1726882904.55697: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 33932 1726882904.55722: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 33932 1726882904.55744: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882904.55770: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 33932 1726882904.55805: variable '__network_wireless_connections_defined' from source: role '' defaults 33932 1726882904.55977: variable 'network_connections' from source: task vars 33932 1726882904.55983: variable 'interface' from source: play vars 33932 1726882904.56033: variable 'interface' from source: play vars 33932 1726882904.56042: variable 'vlan_interface' from source: play vars 33932 1726882904.56096: variable 'vlan_interface' from source: play vars 33932 1726882904.56119: variable '__network_packages_default_wireless' from source: role '' defaults 33932 1726882904.56175: variable '__network_wireless_connections_defined' from source: role '' defaults 33932 1726882904.56352: variable 'network_connections' from source: task vars 33932 1726882904.56355: variable 'interface' from source: play vars 33932 1726882904.56408: variable 'interface' from source: play vars 33932 1726882904.56414: variable 'vlan_interface' from source: play vars 33932 1726882904.56461: variable 'vlan_interface' from source: play vars 33932 1726882904.56481: variable '__network_packages_default_team' from source: role '' defaults 33932 1726882904.56535: variable '__network_team_connections_defined' from source: role '' defaults 33932 
1726882904.56718: variable 'network_connections' from source: task vars 33932 1726882904.56721: variable 'interface' from source: play vars 33932 1726882904.56772: variable 'interface' from source: play vars 33932 1726882904.56775: variable 'vlan_interface' from source: play vars 33932 1726882904.56825: variable 'vlan_interface' from source: play vars 33932 1726882904.56862: variable '__network_service_name_default_initscripts' from source: role '' defaults 33932 1726882904.56905: variable '__network_service_name_default_initscripts' from source: role '' defaults 33932 1726882904.56914: variable '__network_packages_default_initscripts' from source: role '' defaults 33932 1726882904.56954: variable '__network_packages_default_initscripts' from source: role '' defaults 33932 1726882904.57093: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 33932 1726882904.57400: variable 'network_connections' from source: task vars 33932 1726882904.57403: variable 'interface' from source: play vars 33932 1726882904.57445: variable 'interface' from source: play vars 33932 1726882904.57455: variable 'vlan_interface' from source: play vars 33932 1726882904.57498: variable 'vlan_interface' from source: play vars 33932 1726882904.57504: variable 'ansible_distribution' from source: facts 33932 1726882904.57507: variable '__network_rh_distros' from source: role '' defaults 33932 1726882904.57512: variable 'ansible_distribution_major_version' from source: facts 33932 1726882904.57523: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 33932 1726882904.57636: variable 'ansible_distribution' from source: facts 33932 1726882904.57639: variable '__network_rh_distros' from source: role '' defaults 33932 1726882904.57644: variable 'ansible_distribution_major_version' from source: facts 33932 1726882904.57654: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 33932 
1726882904.57765: variable 'ansible_distribution' from source: facts 33932 1726882904.57771: variable '__network_rh_distros' from source: role '' defaults 33932 1726882904.57778: variable 'ansible_distribution_major_version' from source: facts 33932 1726882904.57800: variable 'network_provider' from source: set_fact 33932 1726882904.57817: variable 'omit' from source: magic vars 33932 1726882904.57837: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882904.57855: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882904.57872: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882904.57888: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882904.57898: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882904.57918: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882904.57921: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882904.57923: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882904.57990: Set connection var ansible_shell_executable to /bin/sh 33932 1726882904.57997: Set connection var ansible_timeout to 10 33932 1726882904.58002: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882904.58008: Set connection var ansible_pipelining to False 33932 1726882904.58011: Set connection var ansible_connection to ssh 33932 1726882904.58013: Set connection var ansible_shell_type to sh 33932 1726882904.58029: variable 'ansible_shell_executable' from source: unknown 33932 1726882904.58033: variable 'ansible_connection' from source: unknown 33932 1726882904.58036: variable 
'ansible_module_compression' from source: unknown 33932 1726882904.58039: variable 'ansible_shell_type' from source: unknown 33932 1726882904.58041: variable 'ansible_shell_executable' from source: unknown 33932 1726882904.58047: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882904.58049: variable 'ansible_pipelining' from source: unknown 33932 1726882904.58051: variable 'ansible_timeout' from source: unknown 33932 1726882904.58053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882904.58120: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882904.58127: variable 'omit' from source: magic vars 33932 1726882904.58133: starting attempt loop 33932 1726882904.58136: running the handler 33932 1726882904.58191: variable 'ansible_facts' from source: unknown 33932 1726882904.58599: _low_level_execute_command(): starting 33932 1726882904.58602: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 33932 1726882904.59109: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882904.59117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882904.59148: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882904.59162: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882904.59178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882904.59226: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882904.59232: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882904.59246: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882904.59359: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882904.61050: stdout chunk (state=3): >>>/root <<< 33932 1726882904.61153: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882904.61204: stderr chunk (state=3): >>><<< 33932 1726882904.61207: stdout chunk (state=3): >>><<< 33932 1726882904.61225: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 
10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882904.61235: _low_level_execute_command(): starting 33932 1726882904.61239: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882904.6122465-35067-140685602643944 `" && echo ansible-tmp-1726882904.6122465-35067-140685602643944="` echo /root/.ansible/tmp/ansible-tmp-1726882904.6122465-35067-140685602643944 `" ) && sleep 0' 33932 1726882904.61667: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882904.61681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882904.61702: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 33932 1726882904.61714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 33932 1726882904.61723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882904.61772: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882904.61785: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882904.61981: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882904.63809: stdout chunk (state=3): >>>ansible-tmp-1726882904.6122465-35067-140685602643944=/root/.ansible/tmp/ansible-tmp-1726882904.6122465-35067-140685602643944 <<< 33932 1726882904.63981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882904.64003: stderr chunk (state=3): >>><<< 33932 1726882904.64006: stdout chunk (state=3): >>><<< 33932 1726882904.64020: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882904.6122465-35067-140685602643944=/root/.ansible/tmp/ansible-tmp-1726882904.6122465-35067-140685602643944 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 33932 1726882904.64051: variable 'ansible_module_compression' from source: unknown 33932 1726882904.64103: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-33932njcnmxb6/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 33932 1726882904.64155: variable 'ansible_facts' from source: unknown 33932 1726882904.64339: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882904.6122465-35067-140685602643944/AnsiballZ_systemd.py 33932 1726882904.64478: Sending initial data 33932 1726882904.64481: Sent initial data (156 bytes) 33932 1726882904.65848: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882904.65854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882904.65912: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882904.65915: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration <<< 33932 1726882904.65918: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882904.65942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 33932 1726882904.65944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882904.65995: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 
1726882904.65998: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882904.66102: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882904.67995: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 33932 1726882904.68180: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-33932njcnmxb6/tmpjo8km484 /root/.ansible/tmp/ansible-tmp-1726882904.6122465-35067-140685602643944/AnsiballZ_systemd.py <<< 33932 1726882904.68230: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 33932 1726882904.71609: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882904.71613: stdout chunk (state=3): >>><<< 33932 1726882904.71619: stderr chunk (state=3): >>><<< 33932 1726882904.71638: done transferring module to remote 33932 1726882904.71648: _low_level_execute_command(): starting 33932 1726882904.71653: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882904.6122465-35067-140685602643944/ /root/.ansible/tmp/ansible-tmp-1726882904.6122465-35067-140685602643944/AnsiballZ_systemd.py && sleep 0' 33932 1726882904.73224: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 33932 1726882904.73233: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882904.73243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882904.73256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882904.73909: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882904.73916: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882904.73926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882904.73940: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882904.73947: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882904.73953: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882904.73961: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882904.73973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882904.73984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882904.73991: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882904.73997: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882904.74006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882904.74084: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882904.74099: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882904.74108: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 33932 1726882904.74230: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882904.76148: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882904.76152: stdout chunk (state=3): >>><<< 33932 1726882904.76158: stderr chunk (state=3): >>><<< 33932 1726882904.76181: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882904.76184: _low_level_execute_command(): starting 33932 1726882904.76187: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882904.6122465-35067-140685602643944/AnsiballZ_systemd.py && sleep 0' 33932 1726882904.78062: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882904.78073: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 33932 1726882904.78097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882904.78136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882904.78173: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882904.78250: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882904.78258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882904.78273: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882904.78292: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882904.78295: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882904.78298: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882904.78300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882904.78319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882904.78322: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882904.78345: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882904.78348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882904.78431: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882904.78590: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882904.78593: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882904.78751: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 
1726882905.04095: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "72917", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:40:57 EDT", "ExecMainStartTimestampMonotonic": "1015349250", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "72917", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Fri 2024-09-20 21:40:57 EDT] ; stop_time=[n/a] ; pid=72917 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Fri 2024-09-20 21:40:57 EDT] ; stop_time=[n/a] ; pid=72917 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 
; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": <<< 33932 1726882905.04120: stdout chunk (state=3): >>>"system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5449", "MemoryCurrent": "6111232", "MemoryAvailable": "infinity", "CPUUsageNSec": "175946000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", 
"LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", 
"RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal<<< 33932 1726882905.04127: stdout chunk (state=3): >>>": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service network.target multi-user.target network.service cloud-init.service", "After": "network-pre.target system.slice cloud-init-local.service sysinit.target systemd-journald.socket basic.target dbus-broker.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:40:57 EDT", "StateChangeTimestampMonotonic": "1015433030", "InactiveExitTimestamp": "Fri 2024-09-20 21:40:57 EDT", "InactiveExitTimestampMonotonic": "1015349539", "ActiveEnterTimestamp": "Fri 2024-09-20 21:40:57 EDT", "ActiveEnterTimestampMonotonic": "1015433030", "ActiveExitTimestamp": "Fri 2024-09-20 21:40:57 EDT", "ActiveExitTimestampMonotonic": "1015317264", "InactiveEnterTimestamp": "Fri 2024-09-20 21:40:57 EDT", "InactiveEnterTimestampMonotonic": "1015342641", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", 
"RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:40:57 EDT", "ConditionTimestampMonotonic": "1015343435", "AssertTimestamp": "Fri 2024-09-20 21:40:57 EDT", "AssertTimestampMonotonic": "1015343438", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "d73a95d8f1ea4be78e350e6440c36a44", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 33932 1726882905.05692: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 33932 1726882905.05751: stderr chunk (state=3): >>><<< 33932 1726882905.05755: stdout chunk (state=3): >>><<< 33932 1726882905.05776: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "72917", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:40:57 EDT", "ExecMainStartTimestampMonotonic": "1015349250", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "72917", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Fri 2024-09-20 21:40:57 EDT] ; stop_time=[n/a] ; pid=72917 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Fri 2024-09-20 21:40:57 EDT] ; stop_time=[n/a] ; pid=72917 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ 
path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5449", "MemoryCurrent": "6111232", "MemoryAvailable": "infinity", "CPUUsageNSec": "175946000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": 
"infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", 
"TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service network.target multi-user.target network.service cloud-init.service", "After": "network-pre.target system.slice cloud-init-local.service sysinit.target systemd-journald.socket basic.target dbus-broker.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:40:57 EDT", "StateChangeTimestampMonotonic": "1015433030", "InactiveExitTimestamp": "Fri 2024-09-20 21:40:57 EDT", "InactiveExitTimestampMonotonic": "1015349539", "ActiveEnterTimestamp": "Fri 2024-09-20 21:40:57 EDT", "ActiveEnterTimestampMonotonic": "1015433030", "ActiveExitTimestamp": "Fri 2024-09-20 21:40:57 EDT", "ActiveExitTimestampMonotonic": "1015317264", "InactiveEnterTimestamp": "Fri 2024-09-20 21:40:57 EDT", "InactiveEnterTimestampMonotonic": "1015342641", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", 
"StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:40:57 EDT", "ConditionTimestampMonotonic": "1015343435", "AssertTimestamp": "Fri 2024-09-20 21:40:57 EDT", "AssertTimestampMonotonic": "1015343438", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "d73a95d8f1ea4be78e350e6440c36a44", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 33932 1726882905.05892: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882904.6122465-35067-140685602643944/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 33932 1726882905.05906: _low_level_execute_command(): starting 33932 1726882905.05911: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882904.6122465-35067-140685602643944/ > /dev/null 2>&1 && sleep 0' 33932 1726882905.06375: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882905.06379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882905.06415: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882905.06418: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882905.06421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882905.06470: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882905.06482: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882905.06587: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882905.08413: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882905.08461: stderr chunk (state=3): >>><<< 33932 1726882905.08465: stdout chunk (state=3): >>><<< 33932 1726882905.08480: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882905.08487: handler run complete 33932 1726882905.08523: attempt loop complete, returning result 33932 1726882905.08525: _execute() done 33932 1726882905.08528: dumping result to json 33932 1726882905.08539: done dumping result, returning 33932 1726882905.08547: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-615b-5c48-000000000071] 33932 1726882905.08551: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000071 33932 1726882905.09014: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000071 33932 1726882905.09017: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 33932 1726882905.09053: no more pending results, returning what we have 33932 1726882905.09055: results queue empty 33932 1726882905.09055: checking for any_errors_fatal 33932 1726882905.09058: done checking for any_errors_fatal 33932 1726882905.09059: checking for max_fail_percentage 33932 1726882905.09060: done checking for max_fail_percentage 33932 1726882905.09060: checking to see if all hosts have failed and the running result is not ok 33932 1726882905.09061: done checking to see if all hosts have failed 33932 1726882905.09061: getting the remaining hosts for this loop 33932 1726882905.09062: done getting the remaining hosts for this loop 33932 1726882905.09066: getting the next task for host managed_node1 33932 1726882905.09070: done getting next task for host managed_node1 33932 1726882905.09073: ^ task is: TASK: fedora.linux_system_roles.network : Enable and 
start wpa_supplicant 33932 1726882905.09075: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33932 1726882905.09083: getting variables 33932 1726882905.09084: in VariableManager get_vars() 33932 1726882905.09108: Calling all_inventory to load vars for managed_node1 33932 1726882905.09110: Calling groups_inventory to load vars for managed_node1 33932 1726882905.09111: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882905.09118: Calling all_plugins_play to load vars for managed_node1 33932 1726882905.09120: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882905.09122: Calling groups_plugins_play to load vars for managed_node1 33932 1726882905.09823: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882905.10755: done with get_vars() 33932 1726882905.10772: done getting variables 33932 1726882905.10814: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:41:45 -0400 
(0:00:00.600) 0:00:25.576 ****** 33932 1726882905.10839: entering _queue_task() for managed_node1/service 33932 1726882905.11055: worker is 1 (out of 1 available) 33932 1726882905.11071: exiting _queue_task() for managed_node1/service 33932 1726882905.11083: done queuing things up, now waiting for results queue to drain 33932 1726882905.11085: waiting for pending results... 33932 1726882905.11261: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 33932 1726882905.11354: in run() - task 0e448fcc-3ce9-615b-5c48-000000000072 33932 1726882905.11368: variable 'ansible_search_path' from source: unknown 33932 1726882905.11377: variable 'ansible_search_path' from source: unknown 33932 1726882905.11407: calling self._execute() 33932 1726882905.11493: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882905.11497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882905.11506: variable 'omit' from source: magic vars 33932 1726882905.11792: variable 'ansible_distribution_major_version' from source: facts 33932 1726882905.11803: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882905.11888: variable 'network_provider' from source: set_fact 33932 1726882905.11892: Evaluated conditional (network_provider == "nm"): True 33932 1726882905.11956: variable '__network_wpa_supplicant_required' from source: role '' defaults 33932 1726882905.12024: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 33932 1726882905.12141: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 33932 1726882905.13664: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 33932 1726882905.13713: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 33932 
1726882905.13740: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 33932 1726882905.13766: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 33932 1726882905.13791: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 33932 1726882905.13965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882905.13992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882905.14007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882905.14035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882905.14045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882905.14079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882905.14099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882905.14115: 
Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882905.14140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882905.14151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882905.14182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882905.14199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882905.14217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882905.14242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882905.14252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882905.14353: variable 'network_connections' from source: task vars 33932 1726882905.14362: variable 'interface' from source: play vars 33932 1726882905.14411: variable 'interface' from source: play vars 33932 
1726882905.14417: variable 'vlan_interface' from source: play vars 33932 1726882905.14462: variable 'vlan_interface' from source: play vars 33932 1726882905.14513: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 33932 1726882905.14623: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 33932 1726882905.14652: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 33932 1726882905.14679: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 33932 1726882905.14701: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 33932 1726882905.14731: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 33932 1726882905.14749: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 33932 1726882905.14768: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882905.14789: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 33932 1726882905.14827: variable '__network_wireless_connections_defined' from source: role '' defaults 33932 1726882905.14994: variable 'network_connections' from source: task vars 33932 1726882905.14997: variable 'interface' from source: play vars 33932 1726882905.15039: variable 'interface' from source: play vars 33932 1726882905.15045: variable 
'vlan_interface' from source: play vars 33932 1726882905.15092: variable 'vlan_interface' from source: play vars 33932 1726882905.15114: Evaluated conditional (__network_wpa_supplicant_required): False 33932 1726882905.15117: when evaluation is False, skipping this task 33932 1726882905.15128: _execute() done 33932 1726882905.15131: dumping result to json 33932 1726882905.15133: done dumping result, returning 33932 1726882905.15136: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-615b-5c48-000000000072] 33932 1726882905.15138: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000072 33932 1726882905.15228: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000072 33932 1726882905.15230: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 33932 1726882905.15272: no more pending results, returning what we have 33932 1726882905.15276: results queue empty 33932 1726882905.15277: checking for any_errors_fatal 33932 1726882905.15300: done checking for any_errors_fatal 33932 1726882905.15301: checking for max_fail_percentage 33932 1726882905.15303: done checking for max_fail_percentage 33932 1726882905.15304: checking to see if all hosts have failed and the running result is not ok 33932 1726882905.15305: done checking to see if all hosts have failed 33932 1726882905.15306: getting the remaining hosts for this loop 33932 1726882905.15307: done getting the remaining hosts for this loop 33932 1726882905.15311: getting the next task for host managed_node1 33932 1726882905.15317: done getting next task for host managed_node1 33932 1726882905.15320: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 33932 1726882905.15323: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33932 1726882905.15339: getting variables 33932 1726882905.15341: in VariableManager get_vars() 33932 1726882905.15378: Calling all_inventory to load vars for managed_node1 33932 1726882905.15381: Calling groups_inventory to load vars for managed_node1 33932 1726882905.15383: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882905.15396: Calling all_plugins_play to load vars for managed_node1 33932 1726882905.15399: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882905.15405: Calling groups_plugins_play to load vars for managed_node1 33932 1726882905.16282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882905.17224: done with get_vars() 33932 1726882905.17241: done getting variables 33932 1726882905.17284: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:41:45 -0400 (0:00:00.064) 0:00:25.640 ****** 33932 1726882905.17306: entering _queue_task() for managed_node1/service 33932 1726882905.17511: worker is 1 (out 
of 1 available) 33932 1726882905.17526: exiting _queue_task() for managed_node1/service 33932 1726882905.17538: done queuing things up, now waiting for results queue to drain 33932 1726882905.17540: waiting for pending results... 33932 1726882905.17710: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 33932 1726882905.17799: in run() - task 0e448fcc-3ce9-615b-5c48-000000000073 33932 1726882905.17810: variable 'ansible_search_path' from source: unknown 33932 1726882905.17814: variable 'ansible_search_path' from source: unknown 33932 1726882905.17845: calling self._execute() 33932 1726882905.17922: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882905.17926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882905.17932: variable 'omit' from source: magic vars 33932 1726882905.18200: variable 'ansible_distribution_major_version' from source: facts 33932 1726882905.18210: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882905.18288: variable 'network_provider' from source: set_fact 33932 1726882905.18292: Evaluated conditional (network_provider == "initscripts"): False 33932 1726882905.18295: when evaluation is False, skipping this task 33932 1726882905.18298: _execute() done 33932 1726882905.18302: dumping result to json 33932 1726882905.18305: done dumping result, returning 33932 1726882905.18311: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-615b-5c48-000000000073] 33932 1726882905.18318: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000073 33932 1726882905.18403: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000073 33932 1726882905.18406: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", 
"changed": false } 33932 1726882905.18462: no more pending results, returning what we have 33932 1726882905.18467: results queue empty 33932 1726882905.18468: checking for any_errors_fatal 33932 1726882905.18473: done checking for any_errors_fatal 33932 1726882905.18474: checking for max_fail_percentage 33932 1726882905.18475: done checking for max_fail_percentage 33932 1726882905.18476: checking to see if all hosts have failed and the running result is not ok 33932 1726882905.18477: done checking to see if all hosts have failed 33932 1726882905.18478: getting the remaining hosts for this loop 33932 1726882905.18479: done getting the remaining hosts for this loop 33932 1726882905.18482: getting the next task for host managed_node1 33932 1726882905.18487: done getting next task for host managed_node1 33932 1726882905.18491: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 33932 1726882905.18494: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882905.18508: getting variables 33932 1726882905.18509: in VariableManager get_vars() 33932 1726882905.18549: Calling all_inventory to load vars for managed_node1 33932 1726882905.18551: Calling groups_inventory to load vars for managed_node1 33932 1726882905.18553: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882905.18559: Calling all_plugins_play to load vars for managed_node1 33932 1726882905.18561: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882905.18563: Calling groups_plugins_play to load vars for managed_node1 33932 1726882905.19328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882905.20273: done with get_vars() 33932 1726882905.20288: done getting variables 33932 1726882905.20327: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:41:45 -0400 (0:00:00.030) 0:00:25.671 ****** 33932 1726882905.20349: entering _queue_task() for managed_node1/copy 33932 1726882905.20537: worker is 1 (out of 1 available) 33932 1726882905.20551: exiting _queue_task() for managed_node1/copy 33932 1726882905.20561: done queuing things up, now waiting for results queue to drain 33932 1726882905.20565: waiting for pending results... 
33932 1726882905.20731: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 33932 1726882905.20810: in run() - task 0e448fcc-3ce9-615b-5c48-000000000074 33932 1726882905.20822: variable 'ansible_search_path' from source: unknown 33932 1726882905.20826: variable 'ansible_search_path' from source: unknown 33932 1726882905.20853: calling self._execute() 33932 1726882905.20931: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882905.20935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882905.20943: variable 'omit' from source: magic vars 33932 1726882905.21209: variable 'ansible_distribution_major_version' from source: facts 33932 1726882905.21220: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882905.21300: variable 'network_provider' from source: set_fact 33932 1726882905.21304: Evaluated conditional (network_provider == "initscripts"): False 33932 1726882905.21307: when evaluation is False, skipping this task 33932 1726882905.21310: _execute() done 33932 1726882905.21313: dumping result to json 33932 1726882905.21315: done dumping result, returning 33932 1726882905.21323: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-615b-5c48-000000000074] 33932 1726882905.21327: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000074 33932 1726882905.21416: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000074 33932 1726882905.21419: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 33932 1726882905.21483: no more pending results, returning what we have 33932 1726882905.21486: results queue empty 33932 1726882905.21487: checking for 
any_errors_fatal 33932 1726882905.21491: done checking for any_errors_fatal 33932 1726882905.21492: checking for max_fail_percentage 33932 1726882905.21494: done checking for max_fail_percentage 33932 1726882905.21495: checking to see if all hosts have failed and the running result is not ok 33932 1726882905.21495: done checking to see if all hosts have failed 33932 1726882905.21496: getting the remaining hosts for this loop 33932 1726882905.21497: done getting the remaining hosts for this loop 33932 1726882905.21500: getting the next task for host managed_node1 33932 1726882905.21505: done getting next task for host managed_node1 33932 1726882905.21508: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 33932 1726882905.21511: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882905.21525: getting variables 33932 1726882905.21526: in VariableManager get_vars() 33932 1726882905.21559: Calling all_inventory to load vars for managed_node1 33932 1726882905.21561: Calling groups_inventory to load vars for managed_node1 33932 1726882905.21563: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882905.21572: Calling all_plugins_play to load vars for managed_node1 33932 1726882905.21574: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882905.21576: Calling groups_plugins_play to load vars for managed_node1 33932 1726882905.22436: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882905.23368: done with get_vars() 33932 1726882905.23384: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:41:45 -0400 (0:00:00.030) 0:00:25.702 ****** 33932 1726882905.23444: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 33932 1726882905.23661: worker is 1 (out of 1 available) 33932 1726882905.23677: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 33932 1726882905.23689: done queuing things up, now waiting for results queue to drain 33932 1726882905.23691: waiting for pending results... 
33932 1726882905.23866: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 33932 1726882905.23959: in run() - task 0e448fcc-3ce9-615b-5c48-000000000075 33932 1726882905.23976: variable 'ansible_search_path' from source: unknown 33932 1726882905.23979: variable 'ansible_search_path' from source: unknown 33932 1726882905.24008: calling self._execute() 33932 1726882905.24086: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882905.24090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882905.24097: variable 'omit' from source: magic vars 33932 1726882905.24370: variable 'ansible_distribution_major_version' from source: facts 33932 1726882905.24382: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882905.24387: variable 'omit' from source: magic vars 33932 1726882905.24426: variable 'omit' from source: magic vars 33932 1726882905.24540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 33932 1726882905.26086: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 33932 1726882905.26131: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 33932 1726882905.26157: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 33932 1726882905.26188: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 33932 1726882905.26209: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 33932 1726882905.26267: variable 'network_provider' from source: set_fact 33932 1726882905.26359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882905.26394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882905.26411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882905.26442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882905.26452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882905.26507: variable 'omit' from source: magic vars 33932 1726882905.26587: variable 'omit' from source: magic vars 33932 1726882905.26659: variable 'network_connections' from source: task vars 33932 1726882905.26667: variable 'interface' from source: play vars 33932 1726882905.26713: variable 'interface' from source: play vars 33932 1726882905.26721: variable 'vlan_interface' from source: play vars 33932 1726882905.26767: variable 'vlan_interface' from source: play vars 33932 1726882905.26874: variable 'omit' from source: magic vars 33932 1726882905.26885: variable '__lsr_ansible_managed' from source: task vars 33932 1726882905.26926: variable '__lsr_ansible_managed' from source: task vars 33932 1726882905.27122: Loaded config def from plugin (lookup/template) 33932 1726882905.27126: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 33932 1726882905.27146: File lookup term: 
get_ansible_managed.j2 33932 1726882905.27150: variable 'ansible_search_path' from source: unknown 33932 1726882905.27153: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 33932 1726882905.27165: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 33932 1726882905.27182: variable 'ansible_search_path' from source: unknown 33932 1726882905.31117: variable 'ansible_managed' from source: unknown 33932 1726882905.31240: variable 'omit' from source: magic vars 33932 1726882905.31275: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882905.31306: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882905.31328: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882905.31350: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882905.31368: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882905.31400: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882905.31411: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882905.31420: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882905.31518: Set connection var ansible_shell_executable to /bin/sh 33932 1726882905.31532: Set connection var ansible_timeout to 10 33932 1726882905.31543: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882905.31553: Set connection var ansible_pipelining to False 33932 1726882905.31559: Set connection var ansible_connection to ssh 33932 1726882905.31569: Set connection var ansible_shell_type to sh 33932 1726882905.31599: variable 'ansible_shell_executable' from source: unknown 33932 1726882905.31602: variable 'ansible_connection' from source: unknown 33932 1726882905.31604: variable 'ansible_module_compression' from source: unknown 33932 1726882905.31607: variable 'ansible_shell_type' from source: unknown 33932 1726882905.31609: variable 'ansible_shell_executable' from source: unknown 33932 1726882905.31611: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882905.31613: variable 'ansible_pipelining' from source: unknown 33932 1726882905.31616: variable 'ansible_timeout' from source: unknown 33932 1726882905.31621: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882905.31739: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 33932 1726882905.31753: variable 'omit' from source: magic vars 33932 1726882905.31756: starting attempt loop 33932 1726882905.31758: running the handler 33932 1726882905.31765: _low_level_execute_command(): starting 33932 1726882905.31776: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 33932 1726882905.32260: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882905.32279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882905.32298: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882905.32313: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882905.32355: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882905.32369: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882905.32481: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882905.34165: stdout chunk (state=3): >>>/root <<< 33932 
1726882905.34335: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882905.34346: stdout chunk (state=3): >>><<< 33932 1726882905.34358: stderr chunk (state=3): >>><<< 33932 1726882905.34387: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882905.34404: _low_level_execute_command(): starting 33932 1726882905.34413: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882905.343931-35094-231368377645981 `" && echo ansible-tmp-1726882905.343931-35094-231368377645981="` echo /root/.ansible/tmp/ansible-tmp-1726882905.343931-35094-231368377645981 `" ) && sleep 0' 33932 1726882905.35034: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882905.35048: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882905.35065: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882905.35085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882905.35125: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882905.35138: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882905.35155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882905.35174: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882905.35185: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882905.35195: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882905.35205: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882905.35219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882905.35235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882905.35246: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882905.35262: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882905.35277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882905.35352: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882905.35381: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882905.35396: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882905.35517: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882905.37406: stdout chunk (state=3): >>>ansible-tmp-1726882905.343931-35094-231368377645981=/root/.ansible/tmp/ansible-tmp-1726882905.343931-35094-231368377645981 <<< 33932 1726882905.37516: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882905.37583: stderr chunk (state=3): >>><<< 33932 1726882905.37586: stdout chunk (state=3): >>><<< 33932 1726882905.37677: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882905.343931-35094-231368377645981=/root/.ansible/tmp/ansible-tmp-1726882905.343931-35094-231368377645981 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882905.37680: variable 'ansible_module_compression' from source: unknown 33932 1726882905.37875: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-33932njcnmxb6/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 33932 1726882905.37879: variable 'ansible_facts' from source: unknown 33932 1726882905.37882: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882905.343931-35094-231368377645981/AnsiballZ_network_connections.py 33932 1726882905.38005: Sending initial data 33932 1726882905.38008: Sent initial data (167 bytes) 33932 1726882905.39199: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882905.39489: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882905.39509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882905.39527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882905.39574: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882905.39587: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882905.39600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882905.39621: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882905.39632: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882905.39642: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882905.39652: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882905.39665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882905.39684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882905.39697: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882905.39707: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882905.39719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882905.39804: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882905.39825: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882905.39848: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882905.39969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882905.41770: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 33932 1726882905.41861: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 33932 1726882905.41954: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-33932njcnmxb6/tmp4ed7uu0k /root/.ansible/tmp/ansible-tmp-1726882905.343931-35094-231368377645981/AnsiballZ_network_connections.py <<< 33932 1726882905.42041: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 33932 1726882905.44470: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882905.44571: stderr chunk (state=3): >>><<< 33932 
1726882905.44578: stdout chunk (state=3): >>><<< 33932 1726882905.44691: done transferring module to remote 33932 1726882905.44694: _low_level_execute_command(): starting 33932 1726882905.44696: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882905.343931-35094-231368377645981/ /root/.ansible/tmp/ansible-tmp-1726882905.343931-35094-231368377645981/AnsiballZ_network_connections.py && sleep 0' 33932 1726882905.45371: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882905.45375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882905.45412: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 33932 1726882905.45416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882905.45418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882905.45489: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882905.45492: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882905.45496: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882905.45593: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 33932 1726882905.47488: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882905.47491: stdout chunk (state=3): >>><<< 33932 1726882905.47494: stderr chunk (state=3): >>><<< 33932 1726882905.47578: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882905.47581: _low_level_execute_command(): starting 33932 1726882905.47584: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882905.343931-35094-231368377645981/AnsiballZ_network_connections.py && sleep 0' 33932 1726882905.49069: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882905.49073: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882905.49112: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882905.49115: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882905.49121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882905.49230: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882905.49365: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882905.49565: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882905.85383: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_pxs1vva3/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_pxs1vva3/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail 
ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr101/bf4b0bae-03ff-4dc6-a59c-c7d19007aec3: error=unknown <<< 33932 1726882905.86867: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_pxs1vva3/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back <<< 33932 1726882905.86884: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_pxs1vva3/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr101.90/6b10c778-6398-4428-94af-1aa693ccf4b1: error=unknown <<< 33932 1726882905.87110: stdout chunk (state=3): >>> <<< 33932 1726882905.87114: stdout chunk (state=3): >>>{"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "persistent_state": "absent", "state": "down"}, {"name": "lsr101.90", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "persistent_state": "absent", "state": "down"}, {"name": "lsr101.90", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 33932 1726882905.88828: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 33932 1726882905.88832: stdout chunk (state=3): >>><<< 33932 1726882905.88834: stderr chunk (state=3): >>><<< 33932 1726882905.88988: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_pxs1vva3/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_pxs1vva3/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr101/bf4b0bae-03ff-4dc6-a59c-c7d19007aec3: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_pxs1vva3/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_pxs1vva3/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr101.90/6b10c778-6398-4428-94af-1aa693ccf4b1: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": 
"lsr101", "persistent_state": "absent", "state": "down"}, {"name": "lsr101.90", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "persistent_state": "absent", "state": "down"}, {"name": "lsr101.90", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
33932 1726882905.88992: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr101', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'lsr101.90', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882905.343931-35094-231368377645981/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 33932 1726882905.89001: _low_level_execute_command(): starting 33932 1726882905.89004: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882905.343931-35094-231368377645981/ > /dev/null 2>&1 && sleep 0' 33932 1726882905.90245: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882905.90883: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882905.90898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882905.90915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882905.90958: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882905.90974: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882905.90988: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882905.91004: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882905.91015: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882905.91024: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882905.91034: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882905.91045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882905.91059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882905.91079: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882905.91090: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882905.91102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882905.91182: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882905.91198: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882905.91212: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882905.91339: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882905.93320: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882905.93323: stdout chunk (state=3): >>><<< 33932 1726882905.93325: stderr chunk (state=3): >>><<< 33932 1726882905.93371: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882905.93375: handler run complete 33932 1726882905.93674: attempt loop complete, returning result 33932 1726882905.93677: _execute() done 33932 1726882905.93679: dumping result to json 33932 1726882905.93681: done dumping result, returning 33932 1726882905.93683: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-615b-5c48-000000000075] 33932 1726882905.93685: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000075 changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr101", "persistent_state": "absent", "state": "down" }, { "name": "lsr101.90", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 33932 1726882905.93867: no more pending results, returning what we have 33932 1726882905.93870: results queue empty 33932 1726882905.93871: 
checking for any_errors_fatal 33932 1726882905.93877: done checking for any_errors_fatal 33932 1726882905.93878: checking for max_fail_percentage 33932 1726882905.93879: done checking for max_fail_percentage 33932 1726882905.93880: checking to see if all hosts have failed and the running result is not ok 33932 1726882905.93881: done checking to see if all hosts have failed 33932 1726882905.93882: getting the remaining hosts for this loop 33932 1726882905.93883: done getting the remaining hosts for this loop 33932 1726882905.93886: getting the next task for host managed_node1 33932 1726882905.93891: done getting next task for host managed_node1 33932 1726882905.93894: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 33932 1726882905.93897: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882905.93910: getting variables 33932 1726882905.93911: in VariableManager get_vars() 33932 1726882905.93948: Calling all_inventory to load vars for managed_node1 33932 1726882905.93951: Calling groups_inventory to load vars for managed_node1 33932 1726882905.93953: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882905.93966: Calling all_plugins_play to load vars for managed_node1 33932 1726882905.93970: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882905.93974: Calling groups_plugins_play to load vars for managed_node1 33932 1726882905.95679: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000075 33932 1726882905.95684: WORKER PROCESS EXITING 33932 1726882905.96658: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882906.00252: done with get_vars() 33932 1726882906.00279: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:41:46 -0400 (0:00:00.769) 0:00:26.471 ****** 33932 1726882906.00370: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 33932 1726882906.01130: worker is 1 (out of 1 available) 33932 1726882906.01143: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 33932 1726882906.01154: done queuing things up, now waiting for results queue to drain 33932 1726882906.01156: waiting for pending results... 
33932 1726882906.01950: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 33932 1726882906.02206: in run() - task 0e448fcc-3ce9-615b-5c48-000000000076 33932 1726882906.02235: variable 'ansible_search_path' from source: unknown 33932 1726882906.02242: variable 'ansible_search_path' from source: unknown 33932 1726882906.02294: calling self._execute() 33932 1726882906.02436: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882906.02447: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882906.02461: variable 'omit' from source: magic vars 33932 1726882906.03194: variable 'ansible_distribution_major_version' from source: facts 33932 1726882906.03291: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882906.03520: variable 'network_state' from source: role '' defaults 33932 1726882906.03588: Evaluated conditional (network_state != {}): False 33932 1726882906.03681: when evaluation is False, skipping this task 33932 1726882906.03693: _execute() done 33932 1726882906.03702: dumping result to json 33932 1726882906.03710: done dumping result, returning 33932 1726882906.03720: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-615b-5c48-000000000076] 33932 1726882906.03731: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000076 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 33932 1726882906.03897: no more pending results, returning what we have 33932 1726882906.03902: results queue empty 33932 1726882906.03903: checking for any_errors_fatal 33932 1726882906.03916: done checking for any_errors_fatal 33932 1726882906.03917: checking for max_fail_percentage 33932 1726882906.03919: done checking for max_fail_percentage 33932 1726882906.03920: 
checking to see if all hosts have failed and the running result is not ok 33932 1726882906.03921: done checking to see if all hosts have failed 33932 1726882906.03921: getting the remaining hosts for this loop 33932 1726882906.03923: done getting the remaining hosts for this loop 33932 1726882906.03927: getting the next task for host managed_node1 33932 1726882906.03933: done getting next task for host managed_node1 33932 1726882906.03937: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 33932 1726882906.03941: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882906.03955: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000076 33932 1726882906.03959: WORKER PROCESS EXITING 33932 1726882906.03974: getting variables 33932 1726882906.03976: in VariableManager get_vars() 33932 1726882906.04020: Calling all_inventory to load vars for managed_node1 33932 1726882906.04023: Calling groups_inventory to load vars for managed_node1 33932 1726882906.04026: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882906.04040: Calling all_plugins_play to load vars for managed_node1 33932 1726882906.04044: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882906.04047: Calling groups_plugins_play to load vars for managed_node1 33932 1726882906.05631: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882906.07619: done with get_vars() 33932 1726882906.07644: done getting variables 33932 1726882906.07708: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:41:46 -0400 (0:00:00.073) 0:00:26.545 ****** 33932 1726882906.07741: entering _queue_task() for managed_node1/debug 33932 1726882906.08057: worker is 1 (out of 1 available) 33932 1726882906.08072: exiting _queue_task() for managed_node1/debug 33932 1726882906.08084: done queuing things up, now waiting for results queue to drain 33932 1726882906.08086: waiting for pending results... 
33932 1726882906.08555: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 33932 1726882906.08679: in run() - task 0e448fcc-3ce9-615b-5c48-000000000077 33932 1726882906.08694: variable 'ansible_search_path' from source: unknown 33932 1726882906.08698: variable 'ansible_search_path' from source: unknown 33932 1726882906.08738: calling self._execute() 33932 1726882906.08838: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882906.08844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882906.08858: variable 'omit' from source: magic vars 33932 1726882906.09244: variable 'ansible_distribution_major_version' from source: facts 33932 1726882906.09261: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882906.09271: variable 'omit' from source: magic vars 33932 1726882906.09330: variable 'omit' from source: magic vars 33932 1726882906.09372: variable 'omit' from source: magic vars 33932 1726882906.09415: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882906.09449: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882906.09474: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882906.09492: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882906.09508: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882906.09539: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882906.09542: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882906.09544: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 33932 1726882906.09656: Set connection var ansible_shell_executable to /bin/sh 33932 1726882906.09662: Set connection var ansible_timeout to 10 33932 1726882906.09672: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882906.09676: Set connection var ansible_pipelining to False 33932 1726882906.09679: Set connection var ansible_connection to ssh 33932 1726882906.09681: Set connection var ansible_shell_type to sh 33932 1726882906.09710: variable 'ansible_shell_executable' from source: unknown 33932 1726882906.09713: variable 'ansible_connection' from source: unknown 33932 1726882906.09716: variable 'ansible_module_compression' from source: unknown 33932 1726882906.09721: variable 'ansible_shell_type' from source: unknown 33932 1726882906.09726: variable 'ansible_shell_executable' from source: unknown 33932 1726882906.09734: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882906.09738: variable 'ansible_pipelining' from source: unknown 33932 1726882906.09741: variable 'ansible_timeout' from source: unknown 33932 1726882906.09745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882906.09889: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882906.09899: variable 'omit' from source: magic vars 33932 1726882906.09910: starting attempt loop 33932 1726882906.09913: running the handler 33932 1726882906.10045: variable '__network_connections_result' from source: set_fact 33932 1726882906.10101: handler run complete 33932 1726882906.10118: attempt loop complete, returning result 33932 1726882906.10126: _execute() done 33932 1726882906.10129: dumping result to json 33932 1726882906.10131: 
done dumping result, returning 33932 1726882906.10142: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-615b-5c48-000000000077] 33932 1726882906.10147: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000077 33932 1726882906.10238: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000077 33932 1726882906.10241: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "" ] } 33932 1726882906.10330: no more pending results, returning what we have 33932 1726882906.10336: results queue empty 33932 1726882906.10337: checking for any_errors_fatal 33932 1726882906.10344: done checking for any_errors_fatal 33932 1726882906.10345: checking for max_fail_percentage 33932 1726882906.10347: done checking for max_fail_percentage 33932 1726882906.10348: checking to see if all hosts have failed and the running result is not ok 33932 1726882906.10349: done checking to see if all hosts have failed 33932 1726882906.10350: getting the remaining hosts for this loop 33932 1726882906.10352: done getting the remaining hosts for this loop 33932 1726882906.10356: getting the next task for host managed_node1 33932 1726882906.10363: done getting next task for host managed_node1 33932 1726882906.10369: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 33932 1726882906.10373: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 33932 1726882906.10384: getting variables 33932 1726882906.10386: in VariableManager get_vars() 33932 1726882906.10454: Calling all_inventory to load vars for managed_node1 33932 1726882906.10457: Calling groups_inventory to load vars for managed_node1 33932 1726882906.10460: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882906.10473: Calling all_plugins_play to load vars for managed_node1 33932 1726882906.10477: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882906.10480: Calling groups_plugins_play to load vars for managed_node1 33932 1726882906.12336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882906.14545: done with get_vars() 33932 1726882906.14569: done getting variables 33932 1726882906.14626: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:41:46 -0400 (0:00:00.069) 0:00:26.614 ****** 33932 1726882906.14672: entering _queue_task() for managed_node1/debug 33932 1726882906.14974: worker is 1 (out of 1 available) 33932 1726882906.14986: exiting _queue_task() for managed_node1/debug 33932 1726882906.14997: done queuing things up, now waiting for results queue to drain 33932 1726882906.14999: waiting for pending results... 
33932 1726882906.15411: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 33932 1726882906.15583: in run() - task 0e448fcc-3ce9-615b-5c48-000000000078 33932 1726882906.15601: variable 'ansible_search_path' from source: unknown 33932 1726882906.15605: variable 'ansible_search_path' from source: unknown 33932 1726882906.15644: calling self._execute() 33932 1726882906.15753: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882906.15757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882906.15773: variable 'omit' from source: magic vars 33932 1726882906.16149: variable 'ansible_distribution_major_version' from source: facts 33932 1726882906.16161: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882906.16173: variable 'omit' from source: magic vars 33932 1726882906.16225: variable 'omit' from source: magic vars 33932 1726882906.16253: variable 'omit' from source: magic vars 33932 1726882906.16295: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882906.16329: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882906.16347: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882906.16361: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882906.16375: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882906.16407: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882906.16416: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882906.16418: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 33932 1726882906.16535: Set connection var ansible_shell_executable to /bin/sh 33932 1726882906.16542: Set connection var ansible_timeout to 10 33932 1726882906.16548: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882906.16553: Set connection var ansible_pipelining to False 33932 1726882906.16555: Set connection var ansible_connection to ssh 33932 1726882906.16558: Set connection var ansible_shell_type to sh 33932 1726882906.16584: variable 'ansible_shell_executable' from source: unknown 33932 1726882906.16587: variable 'ansible_connection' from source: unknown 33932 1726882906.16590: variable 'ansible_module_compression' from source: unknown 33932 1726882906.16592: variable 'ansible_shell_type' from source: unknown 33932 1726882906.16595: variable 'ansible_shell_executable' from source: unknown 33932 1726882906.16597: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882906.16599: variable 'ansible_pipelining' from source: unknown 33932 1726882906.16601: variable 'ansible_timeout' from source: unknown 33932 1726882906.16612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882906.16762: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882906.16774: variable 'omit' from source: magic vars 33932 1726882906.16779: starting attempt loop 33932 1726882906.16783: running the handler 33932 1726882906.16838: variable '__network_connections_result' from source: set_fact 33932 1726882906.16917: variable '__network_connections_result' from source: set_fact 33932 1726882906.17048: handler run complete 33932 1726882906.17078: attempt loop complete, returning result 33932 1726882906.17081: 
_execute() done 33932 1726882906.17084: dumping result to json 33932 1726882906.17086: done dumping result, returning 33932 1726882906.17100: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-615b-5c48-000000000078] 33932 1726882906.17102: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000078 33932 1726882906.17200: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000078 33932 1726882906.17203: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr101", "persistent_state": "absent", "state": "down" }, { "name": "lsr101.90", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 33932 1726882906.17300: no more pending results, returning what we have 33932 1726882906.17304: results queue empty 33932 1726882906.17305: checking for any_errors_fatal 33932 1726882906.17314: done checking for any_errors_fatal 33932 1726882906.17315: checking for max_fail_percentage 33932 1726882906.17317: done checking for max_fail_percentage 33932 1726882906.17318: checking to see if all hosts have failed and the running result is not ok 33932 1726882906.17319: done checking to see if all hosts have failed 33932 1726882906.17320: getting the remaining hosts for this loop 33932 1726882906.17321: done getting the remaining hosts for this loop 33932 1726882906.17325: getting the next task for host managed_node1 33932 1726882906.17332: done getting next task for host managed_node1 33932 1726882906.17336: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 33932 1726882906.17340: ^ state is: HOST STATE: 
block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33932 1726882906.17352: getting variables 33932 1726882906.17354: in VariableManager get_vars() 33932 1726882906.17419: Calling all_inventory to load vars for managed_node1 33932 1726882906.17422: Calling groups_inventory to load vars for managed_node1 33932 1726882906.17425: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882906.17436: Calling all_plugins_play to load vars for managed_node1 33932 1726882906.17440: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882906.17443: Calling groups_plugins_play to load vars for managed_node1 33932 1726882906.20261: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882906.22187: done with get_vars() 33932 1726882906.22212: done getting variables 33932 1726882906.22273: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:41:46 -0400 (0:00:00.076) 0:00:26.690 ****** 33932 1726882906.22316: entering 
_queue_task() for managed_node1/debug 33932 1726882906.22642: worker is 1 (out of 1 available) 33932 1726882906.22656: exiting _queue_task() for managed_node1/debug 33932 1726882906.22670: done queuing things up, now waiting for results queue to drain 33932 1726882906.22672: waiting for pending results... 33932 1726882906.22984: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 33932 1726882906.23173: in run() - task 0e448fcc-3ce9-615b-5c48-000000000079 33932 1726882906.23177: variable 'ansible_search_path' from source: unknown 33932 1726882906.23179: variable 'ansible_search_path' from source: unknown 33932 1726882906.23182: calling self._execute() 33932 1726882906.23397: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882906.23400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882906.23403: variable 'omit' from source: magic vars 33932 1726882906.23750: variable 'ansible_distribution_major_version' from source: facts 33932 1726882906.23762: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882906.23897: variable 'network_state' from source: role '' defaults 33932 1726882906.23908: Evaluated conditional (network_state != {}): False 33932 1726882906.23911: when evaluation is False, skipping this task 33932 1726882906.23914: _execute() done 33932 1726882906.23917: dumping result to json 33932 1726882906.23919: done dumping result, returning 33932 1726882906.23928: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-615b-5c48-000000000079] 33932 1726882906.23933: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000079 33932 1726882906.24032: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000079 33932 1726882906.24036: WORKER PROCESS EXITING skipping: [managed_node1] => { 
"false_condition": "network_state != {}" } 33932 1726882906.24087: no more pending results, returning what we have 33932 1726882906.24091: results queue empty 33932 1726882906.24092: checking for any_errors_fatal 33932 1726882906.24102: done checking for any_errors_fatal 33932 1726882906.24103: checking for max_fail_percentage 33932 1726882906.24105: done checking for max_fail_percentage 33932 1726882906.24106: checking to see if all hosts have failed and the running result is not ok 33932 1726882906.24106: done checking to see if all hosts have failed 33932 1726882906.24107: getting the remaining hosts for this loop 33932 1726882906.24109: done getting the remaining hosts for this loop 33932 1726882906.24113: getting the next task for host managed_node1 33932 1726882906.24119: done getting next task for host managed_node1 33932 1726882906.24123: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 33932 1726882906.24126: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882906.24145: getting variables 33932 1726882906.24147: in VariableManager get_vars() 33932 1726882906.24189: Calling all_inventory to load vars for managed_node1 33932 1726882906.24192: Calling groups_inventory to load vars for managed_node1 33932 1726882906.24195: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882906.24207: Calling all_plugins_play to load vars for managed_node1 33932 1726882906.24210: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882906.24213: Calling groups_plugins_play to load vars for managed_node1 33932 1726882906.26100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882906.27977: done with get_vars() 33932 1726882906.28007: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:41:46 -0400 (0:00:00.057) 0:00:26.748 ****** 33932 1726882906.28110: entering _queue_task() for managed_node1/ping 33932 1726882906.28460: worker is 1 (out of 1 available) 33932 1726882906.28479: exiting _queue_task() for managed_node1/ping 33932 1726882906.28492: done queuing things up, now waiting for results queue to drain 33932 1726882906.28494: waiting for pending results... 
33932 1726882906.28782: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 33932 1726882906.28922: in run() - task 0e448fcc-3ce9-615b-5c48-00000000007a 33932 1726882906.28941: variable 'ansible_search_path' from source: unknown 33932 1726882906.28944: variable 'ansible_search_path' from source: unknown 33932 1726882906.28984: calling self._execute() 33932 1726882906.29091: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882906.29094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882906.29104: variable 'omit' from source: magic vars 33932 1726882906.29514: variable 'ansible_distribution_major_version' from source: facts 33932 1726882906.29526: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882906.29532: variable 'omit' from source: magic vars 33932 1726882906.29604: variable 'omit' from source: magic vars 33932 1726882906.29638: variable 'omit' from source: magic vars 33932 1726882906.29688: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882906.29722: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882906.29742: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882906.29760: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882906.29775: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882906.29811: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882906.29815: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882906.29817: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 33932 1726882906.29933: Set connection var ansible_shell_executable to /bin/sh 33932 1726882906.29941: Set connection var ansible_timeout to 10 33932 1726882906.29946: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882906.29951: Set connection var ansible_pipelining to False 33932 1726882906.29954: Set connection var ansible_connection to ssh 33932 1726882906.29956: Set connection var ansible_shell_type to sh 33932 1726882906.29984: variable 'ansible_shell_executable' from source: unknown 33932 1726882906.29987: variable 'ansible_connection' from source: unknown 33932 1726882906.29990: variable 'ansible_module_compression' from source: unknown 33932 1726882906.29993: variable 'ansible_shell_type' from source: unknown 33932 1726882906.29998: variable 'ansible_shell_executable' from source: unknown 33932 1726882906.30000: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882906.30005: variable 'ansible_pipelining' from source: unknown 33932 1726882906.30013: variable 'ansible_timeout' from source: unknown 33932 1726882906.30017: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882906.30237: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 33932 1726882906.30248: variable 'omit' from source: magic vars 33932 1726882906.30251: starting attempt loop 33932 1726882906.30253: running the handler 33932 1726882906.30273: _low_level_execute_command(): starting 33932 1726882906.30278: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 33932 1726882906.31073: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882906.31084: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 
1726882906.31100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882906.31119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882906.31159: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882906.31171: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882906.31179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882906.31192: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882906.31206: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882906.31213: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882906.31226: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882906.31236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882906.31247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882906.31255: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882906.31262: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882906.31275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882906.31353: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882906.31376: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882906.31389: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882906.31518: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 
1726882906.33199: stdout chunk (state=3): >>>/root <<< 33932 1726882906.33315: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882906.33405: stderr chunk (state=3): >>><<< 33932 1726882906.33413: stdout chunk (state=3): >>><<< 33932 1726882906.33446: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882906.33459: _low_level_execute_command(): starting 33932 1726882906.33465: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882906.33444-35134-257153363111357 `" && echo ansible-tmp-1726882906.33444-35134-257153363111357="` echo /root/.ansible/tmp/ansible-tmp-1726882906.33444-35134-257153363111357 `" ) && sleep 0' 33932 1726882906.34206: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 
Jun 2024 <<< 33932 1726882906.34221: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882906.34232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882906.34245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882906.34286: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882906.34294: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882906.34304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882906.34319: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882906.34334: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882906.34339: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882906.34348: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882906.34357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882906.34373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882906.34378: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882906.34385: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882906.34395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882906.34474: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882906.34492: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882906.34504: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
33932 1726882906.34631: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882906.36516: stdout chunk (state=3): >>>ansible-tmp-1726882906.33444-35134-257153363111357=/root/.ansible/tmp/ansible-tmp-1726882906.33444-35134-257153363111357 <<< 33932 1726882906.36685: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882906.36688: stdout chunk (state=3): >>><<< 33932 1726882906.36696: stderr chunk (state=3): >>><<< 33932 1726882906.36713: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882906.33444-35134-257153363111357=/root/.ansible/tmp/ansible-tmp-1726882906.33444-35134-257153363111357 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882906.36758: variable 'ansible_module_compression' from source: unknown 33932 1726882906.36800: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-33932njcnmxb6/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 33932 1726882906.36833: variable 'ansible_facts' from source: unknown 33932 1726882906.36909: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882906.33444-35134-257153363111357/AnsiballZ_ping.py 33932 1726882906.37039: Sending initial data 33932 1726882906.37042: Sent initial data (151 bytes) 33932 1726882906.37937: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882906.37946: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882906.37956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882906.37973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882906.38010: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882906.38016: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882906.38027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882906.38040: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882906.38047: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882906.38053: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882906.38061: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882906.38075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882906.38087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882906.38095: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 
1726882906.38102: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882906.38109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882906.38223: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882906.38233: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882906.38236: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882906.38382: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882906.40118: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 33932 1726882906.40210: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 33932 1726882906.40335: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-33932njcnmxb6/tmpzwgosfiz /root/.ansible/tmp/ansible-tmp-1726882906.33444-35134-257153363111357/AnsiballZ_ping.py <<< 33932 1726882906.40407: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 33932 1726882906.41686: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882906.41895: stderr chunk (state=3): >>><<< 33932 1726882906.41898: stdout chunk (state=3): >>><<< 33932 1726882906.41900: done transferring module 
to remote 33932 1726882906.41903: _low_level_execute_command(): starting 33932 1726882906.41909: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882906.33444-35134-257153363111357/ /root/.ansible/tmp/ansible-tmp-1726882906.33444-35134-257153363111357/AnsiballZ_ping.py && sleep 0' 33932 1726882906.42602: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882906.42605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882906.42640: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882906.42643: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882906.42646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882906.43029: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882906.43050: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882906.43069: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882906.43213: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882906.46259: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 33932 1726882906.46341: stderr chunk (state=3): >>><<< 33932 1726882906.46474: stdout chunk (state=3): >>><<< 33932 1726882906.46575: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882906.46579: _low_level_execute_command(): starting 33932 1726882906.46582: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882906.33444-35134-257153363111357/AnsiballZ_ping.py && sleep 0' 33932 1726882906.47410: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882906.47414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882906.47455: stderr chunk (state=3): >>>debug2: checking match 
for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 33932 1726882906.47459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882906.47461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 33932 1726882906.47474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882906.47516: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882906.48443: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882906.48451: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882906.48557: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882906.61498: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 33932 1726882906.62547: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 33932 1726882906.62551: stdout chunk (state=3): >>><<< 33932 1726882906.62556: stderr chunk (state=3): >>><<< 33932 1726882906.62579: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
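The `_low_level_execute_command()` records above each reduce to an `(rc, stdout, stderr)` triple, with the module's JSON result (`{"ping": "pong", ...}`) arriving as the final stdout chunk. A minimal local sketch of that capture-and-parse step — the helper name is hypothetical and this is not Ansible's actual implementation, which streams chunks over a multiplexed SSH session:

```python
import json
import subprocess

def run_remote(cmd):
    """Run a shell command and return (rc, stdout, stderr), mirroring
    the shape of the _low_level_execute_command() results in the log."""
    proc = subprocess.run(cmd, shell=True, capture_output=True, text=True)
    return proc.returncode, proc.stdout, proc.stderr

# Simulate a module that prints its JSON result on stdout,
# the way AnsiballZ_ping.py does in the log above.
rc, out, err = run_remote('echo \'{"ping": "pong"}\'')
result = json.loads(out)
print(rc, result["ping"])
```

The real flow differs mainly in transport (chunked reads over the SSH master connection) but the final parse of stdout into a result dict is the same idea.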
33932 1726882906.62603: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882906.33444-35134-257153363111357/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 33932 1726882906.62612: _low_level_execute_command(): starting 33932 1726882906.62617: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882906.33444-35134-257153363111357/ > /dev/null 2>&1 && sleep 0' 33932 1726882906.64510: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882906.64527: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882906.64543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882906.64578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882906.64624: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882906.64679: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882906.64696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882906.64715: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882906.64784: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 
1726882906.64797: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882906.64810: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882906.64824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882906.64840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882906.64853: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882906.64867: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882906.64888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882906.64962: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882906.65126: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882906.65143: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882906.65276: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882906.67084: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882906.67179: stderr chunk (state=3): >>><<< 33932 1726882906.67182: stdout chunk (state=3): >>><<< 33932 1726882906.67479: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882906.67483: handler run complete 33932 1726882906.67485: attempt loop complete, returning result 33932 1726882906.67487: _execute() done 33932 1726882906.67489: dumping result to json 33932 1726882906.67491: done dumping result, returning 33932 1726882906.67493: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-615b-5c48-00000000007a] 33932 1726882906.67495: sending task result for task 0e448fcc-3ce9-615b-5c48-00000000007a 33932 1726882906.67565: done sending task result for task 0e448fcc-3ce9-615b-5c48-00000000007a 33932 1726882906.67571: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 33932 1726882906.67644: no more pending results, returning what we have 33932 1726882906.67647: results queue empty 33932 1726882906.67648: checking for any_errors_fatal 33932 1726882906.67655: done checking for any_errors_fatal 33932 1726882906.67655: checking for max_fail_percentage 33932 1726882906.67657: done checking for max_fail_percentage 33932 1726882906.67658: checking to see if all hosts have failed and the running result is not ok 33932 1726882906.67659: done checking to see if all hosts have failed 33932 1726882906.67659: getting the remaining hosts for this loop 33932 1726882906.67661: done getting the remaining hosts for this loop 33932 1726882906.67667: getting 
the next task for host managed_node1 33932 1726882906.67678: done getting next task for host managed_node1 33932 1726882906.67681: ^ task is: TASK: meta (role_complete) 33932 1726882906.67684: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33932 1726882906.67695: getting variables 33932 1726882906.67697: in VariableManager get_vars() 33932 1726882906.67742: Calling all_inventory to load vars for managed_node1 33932 1726882906.67745: Calling groups_inventory to load vars for managed_node1 33932 1726882906.67747: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882906.67758: Calling all_plugins_play to load vars for managed_node1 33932 1726882906.67761: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882906.67765: Calling groups_plugins_play to load vars for managed_node1 33932 1726882906.69840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882906.73750: done with get_vars() 33932 1726882906.73782: done getting variables 33932 1726882906.73987: done queuing things up, now waiting for results queue to drain 33932 1726882906.73989: results queue empty 33932 1726882906.73990: checking for any_errors_fatal 33932 1726882906.73993: done checking for any_errors_fatal 33932 1726882906.73994: checking for max_fail_percentage 33932 1726882906.73995: done checking for max_fail_percentage 33932 1726882906.73996: checking to see if all hosts 
have failed and the running result is not ok 33932 1726882906.73996: done checking to see if all hosts have failed 33932 1726882906.73997: getting the remaining hosts for this loop 33932 1726882906.73998: done getting the remaining hosts for this loop 33932 1726882906.74001: getting the next task for host managed_node1 33932 1726882906.74005: done getting next task for host managed_node1 33932 1726882906.74007: ^ task is: TASK: Include the task 'manage_test_interface.yml' 33932 1726882906.74009: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33932 1726882906.74011: getting variables 33932 1726882906.74012: in VariableManager get_vars() 33932 1726882906.74027: Calling all_inventory to load vars for managed_node1 33932 1726882906.74029: Calling groups_inventory to load vars for managed_node1 33932 1726882906.74032: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882906.74149: Calling all_plugins_play to load vars for managed_node1 33932 1726882906.74153: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882906.74156: Calling groups_plugins_play to load vars for managed_node1 33932 1726882906.76125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882906.80273: done with get_vars() 33932 1726882906.80303: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:73 Friday 20 September 2024 21:41:46 -0400 (0:00:00.522) 0:00:27.271 ****** 33932 1726882906.80385: entering _queue_task() for managed_node1/include_tasks 33932 1726882906.80990: worker 
is 1 (out of 1 available) 33932 1726882906.81004: exiting _queue_task() for managed_node1/include_tasks 33932 1726882906.81016: done queuing things up, now waiting for results queue to drain 33932 1726882906.81018: waiting for pending results... 33932 1726882906.81316: running TaskExecutor() for managed_node1/TASK: Include the task 'manage_test_interface.yml' 33932 1726882906.81433: in run() - task 0e448fcc-3ce9-615b-5c48-0000000000aa 33932 1726882906.81454: variable 'ansible_search_path' from source: unknown 33932 1726882906.81508: calling self._execute() 33932 1726882906.81622: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882906.81634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882906.81649: variable 'omit' from source: magic vars 33932 1726882906.82069: variable 'ansible_distribution_major_version' from source: facts 33932 1726882906.82090: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882906.82102: _execute() done 33932 1726882906.82118: dumping result to json 33932 1726882906.82127: done dumping result, returning 33932 1726882906.82138: done running TaskExecutor() for managed_node1/TASK: Include the task 'manage_test_interface.yml' [0e448fcc-3ce9-615b-5c48-0000000000aa] 33932 1726882906.82149: sending task result for task 0e448fcc-3ce9-615b-5c48-0000000000aa 33932 1726882906.82292: no more pending results, returning what we have 33932 1726882906.82298: in VariableManager get_vars() 33932 1726882906.82348: Calling all_inventory to load vars for managed_node1 33932 1726882906.82351: Calling groups_inventory to load vars for managed_node1 33932 1726882906.82354: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882906.82373: Calling all_plugins_play to load vars for managed_node1 33932 1726882906.82377: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882906.82381: Calling groups_plugins_play to load vars 
for managed_node1 33932 1726882906.83404: done sending task result for task 0e448fcc-3ce9-615b-5c48-0000000000aa 33932 1726882906.83407: WORKER PROCESS EXITING 33932 1726882906.84337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882906.86811: done with get_vars() 33932 1726882906.86832: variable 'ansible_search_path' from source: unknown 33932 1726882906.86846: we have included files to process 33932 1726882906.86847: generating all_blocks data 33932 1726882906.86849: done generating all_blocks data 33932 1726882906.86855: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 33932 1726882906.86856: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 33932 1726882906.86859: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 33932 1726882906.87358: in VariableManager get_vars() 33932 1726882906.87387: done with get_vars() 33932 1726882906.88084: done processing included file 33932 1726882906.88086: iterating over new_blocks loaded from include file 33932 1726882906.88087: in VariableManager get_vars() 33932 1726882906.88110: done with get_vars() 33932 1726882906.88112: filtering new block on tags 33932 1726882906.88144: done filtering new block on tags 33932 1726882906.88147: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node1 33932 1726882906.88152: extending task lists for all hosts with included blocks 33932 1726882906.92281: done extending task lists 33932 1726882906.92283: done processing included files 33932 1726882906.92284: results queue empty 33932 
1726882906.92285: checking for any_errors_fatal 33932 1726882906.92287: done checking for any_errors_fatal 33932 1726882906.92288: checking for max_fail_percentage 33932 1726882906.92289: done checking for max_fail_percentage 33932 1726882906.92290: checking to see if all hosts have failed and the running result is not ok 33932 1726882906.92291: done checking to see if all hosts have failed 33932 1726882906.92291: getting the remaining hosts for this loop 33932 1726882906.92293: done getting the remaining hosts for this loop 33932 1726882906.92296: getting the next task for host managed_node1 33932 1726882906.92299: done getting next task for host managed_node1 33932 1726882906.92301: ^ task is: TASK: Ensure state in ["present", "absent"] 33932 1726882906.92304: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882906.92307: getting variables 33932 1726882906.92308: in VariableManager get_vars() 33932 1726882906.92325: Calling all_inventory to load vars for managed_node1 33932 1726882906.92328: Calling groups_inventory to load vars for managed_node1 33932 1726882906.92330: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882906.92336: Calling all_plugins_play to load vars for managed_node1 33932 1726882906.92339: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882906.92342: Calling groups_plugins_play to load vars for managed_node1 33932 1726882906.93867: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882906.96225: done with get_vars() 33932 1726882906.96255: done getting variables 33932 1726882906.96308: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 21:41:46 -0400 (0:00:00.159) 0:00:27.431 ****** 33932 1726882906.96347: entering _queue_task() for managed_node1/fail 33932 1726882906.96714: worker is 1 (out of 1 available) 33932 1726882906.96731: exiting _queue_task() for managed_node1/fail 33932 1726882906.96746: done queuing things up, now waiting for results queue to drain 33932 1726882906.96748: waiting for pending results... 
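The `entering _queue_task()` / `worker is 1 (out of 1 available)` / `waiting for pending results...` sequence above is a producer-consumer handoff: the strategy queues a task, a worker executes it, and the result is drained from a results queue. A toy sketch of that pattern — using a thread and `queue.Queue` for simplicity, whereas Ansible actually uses forked `WorkerProcess`es:

```python
import queue
import threading

def worker(task_q, result_q):
    # Pull one task, run it, report back -- loosely analogous to the
    # WorkerProcess lifecycle in the log (a thread here, not a fork).
    task = task_q.get()
    result_q.put({"task": task, "changed": False})
    # ...WORKER PROCESS EXITING

task_q, result_q = queue.Queue(), queue.Queue()
t = threading.Thread(target=worker, args=(task_q, result_q))
t.start()                    # worker is 1 (out of 1 available)
task_q.put("ping")           # entering _queue_task()
result = result_q.get()      # waiting for pending results...
t.join()
print(result)
```

This is only the shape of the handoff; the real strategy plugin also tracks `any_errors_fatal`, `max_fail_percentage`, and per-host state between drains, as the surrounding log lines show.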
33932 1726882906.96953: running TaskExecutor() for managed_node1/TASK: Ensure state in ["present", "absent"] 33932 1726882906.97022: in run() - task 0e448fcc-3ce9-615b-5c48-00000000093c 33932 1726882906.97033: variable 'ansible_search_path' from source: unknown 33932 1726882906.97037: variable 'ansible_search_path' from source: unknown 33932 1726882906.97070: calling self._execute() 33932 1726882906.97147: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882906.97151: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882906.97159: variable 'omit' from source: magic vars 33932 1726882906.97452: variable 'ansible_distribution_major_version' from source: facts 33932 1726882906.97462: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882906.97558: variable 'state' from source: include params 33932 1726882906.97562: Evaluated conditional (state not in ["present", "absent"]): False 33932 1726882906.97566: when evaluation is False, skipping this task 33932 1726882906.97572: _execute() done 33932 1726882906.97574: dumping result to json 33932 1726882906.97577: done dumping result, returning 33932 1726882906.97579: done running TaskExecutor() for managed_node1/TASK: Ensure state in ["present", "absent"] [0e448fcc-3ce9-615b-5c48-00000000093c] 33932 1726882906.97586: sending task result for task 0e448fcc-3ce9-615b-5c48-00000000093c 33932 1726882906.97689: done sending task result for task 0e448fcc-3ce9-615b-5c48-00000000093c 33932 1726882906.97692: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 33932 1726882906.97779: no more pending results, returning what we have 33932 1726882906.97783: results queue empty 33932 1726882906.97784: checking for any_errors_fatal 33932 1726882906.97786: done checking for any_errors_fatal 33932 1726882906.97786: 
checking for max_fail_percentage 33932 1726882906.97788: done checking for max_fail_percentage 33932 1726882906.97789: checking to see if all hosts have failed and the running result is not ok 33932 1726882906.97790: done checking to see if all hosts have failed 33932 1726882906.97791: getting the remaining hosts for this loop 33932 1726882906.97793: done getting the remaining hosts for this loop 33932 1726882906.97796: getting the next task for host managed_node1 33932 1726882906.97803: done getting next task for host managed_node1 33932 1726882906.97805: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 33932 1726882906.97809: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882906.97812: getting variables 33932 1726882906.97814: in VariableManager get_vars() 33932 1726882906.97864: Calling all_inventory to load vars for managed_node1 33932 1726882906.97871: Calling groups_inventory to load vars for managed_node1 33932 1726882906.97874: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882906.97890: Calling all_plugins_play to load vars for managed_node1 33932 1726882906.97894: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882906.97897: Calling groups_plugins_play to load vars for managed_node1 33932 1726882906.99517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882907.01245: done with get_vars() 33932 1726882907.01273: done getting variables 33932 1726882907.01356: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 21:41:47 -0400 (0:00:00.050) 0:00:27.481 ****** 33932 1726882907.01388: entering _queue_task() for managed_node1/fail 33932 1726882907.01645: worker is 1 (out of 1 available) 33932 1726882907.01660: exiting _queue_task() for managed_node1/fail 33932 1726882907.01676: done queuing things up, now waiting for results queue to drain 33932 1726882907.01678: waiting for pending results... 
33932 1726882907.01836: running TaskExecutor() for managed_node1/TASK: Ensure type in ["dummy", "tap", "veth"] 33932 1726882907.01926: in run() - task 0e448fcc-3ce9-615b-5c48-00000000093d 33932 1726882907.01929: variable 'ansible_search_path' from source: unknown 33932 1726882907.01932: variable 'ansible_search_path' from source: unknown 33932 1726882907.01965: calling self._execute() 33932 1726882907.02173: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882907.02177: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882907.02180: variable 'omit' from source: magic vars 33932 1726882907.02491: variable 'ansible_distribution_major_version' from source: facts 33932 1726882907.02504: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882907.02650: variable 'type' from source: play vars 33932 1726882907.02655: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 33932 1726882907.02658: when evaluation is False, skipping this task 33932 1726882907.02660: _execute() done 33932 1726882907.02666: dumping result to json 33932 1726882907.02672: done dumping result, returning 33932 1726882907.02677: done running TaskExecutor() for managed_node1/TASK: Ensure type in ["dummy", "tap", "veth"] [0e448fcc-3ce9-615b-5c48-00000000093d] 33932 1726882907.02685: sending task result for task 0e448fcc-3ce9-615b-5c48-00000000093d 33932 1726882907.02774: done sending task result for task 0e448fcc-3ce9-615b-5c48-00000000093d 33932 1726882907.02777: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 33932 1726882907.02822: no more pending results, returning what we have 33932 1726882907.02826: results queue empty 33932 1726882907.02827: checking for any_errors_fatal 33932 1726882907.02834: done checking for any_errors_fatal 33932 1726882907.02834: 
checking for max_fail_percentage 33932 1726882907.02837: done checking for max_fail_percentage 33932 1726882907.02838: checking to see if all hosts have failed and the running result is not ok 33932 1726882907.02839: done checking to see if all hosts have failed 33932 1726882907.02840: getting the remaining hosts for this loop 33932 1726882907.02841: done getting the remaining hosts for this loop 33932 1726882907.02845: getting the next task for host managed_node1 33932 1726882907.02850: done getting next task for host managed_node1 33932 1726882907.02854: ^ task is: TASK: Include the task 'show_interfaces.yml' 33932 1726882907.02857: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882907.02860: getting variables 33932 1726882907.02862: in VariableManager get_vars() 33932 1726882907.02900: Calling all_inventory to load vars for managed_node1 33932 1726882907.02903: Calling groups_inventory to load vars for managed_node1 33932 1726882907.02905: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882907.02915: Calling all_plugins_play to load vars for managed_node1 33932 1726882907.02917: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882907.02920: Calling groups_plugins_play to load vars for managed_node1 33932 1726882907.08599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882907.10110: done with get_vars() 33932 1726882907.10167: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 21:41:47 -0400 (0:00:00.088) 0:00:27.570 ****** 33932 1726882907.10245: entering _queue_task() for managed_node1/include_tasks 33932 1726882907.10599: worker is 1 (out of 1 available) 33932 1726882907.10612: exiting _queue_task() for managed_node1/include_tasks 33932 1726882907.10623: done queuing things up, now waiting for results queue to drain 33932 1726882907.10625: waiting for pending results... 
33932 1726882907.11017: running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' 33932 1726882907.11147: in run() - task 0e448fcc-3ce9-615b-5c48-00000000093e 33932 1726882907.11157: variable 'ansible_search_path' from source: unknown 33932 1726882907.11161: variable 'ansible_search_path' from source: unknown 33932 1726882907.11203: calling self._execute() 33932 1726882907.11289: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882907.11293: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882907.11302: variable 'omit' from source: magic vars 33932 1726882907.11598: variable 'ansible_distribution_major_version' from source: facts 33932 1726882907.11610: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882907.11614: _execute() done 33932 1726882907.11618: dumping result to json 33932 1726882907.11621: done dumping result, returning 33932 1726882907.11627: done running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' [0e448fcc-3ce9-615b-5c48-00000000093e] 33932 1726882907.11632: sending task result for task 0e448fcc-3ce9-615b-5c48-00000000093e 33932 1726882907.11718: done sending task result for task 0e448fcc-3ce9-615b-5c48-00000000093e 33932 1726882907.11723: WORKER PROCESS EXITING 33932 1726882907.11750: no more pending results, returning what we have 33932 1726882907.11754: in VariableManager get_vars() 33932 1726882907.11801: Calling all_inventory to load vars for managed_node1 33932 1726882907.11804: Calling groups_inventory to load vars for managed_node1 33932 1726882907.11806: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882907.11818: Calling all_plugins_play to load vars for managed_node1 33932 1726882907.11821: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882907.11823: Calling groups_plugins_play to load vars for managed_node1 33932 1726882907.12603: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882907.14187: done with get_vars() 33932 1726882907.14212: variable 'ansible_search_path' from source: unknown 33932 1726882907.14213: variable 'ansible_search_path' from source: unknown 33932 1726882907.14238: we have included files to process 33932 1726882907.14238: generating all_blocks data 33932 1726882907.14240: done generating all_blocks data 33932 1726882907.14243: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 33932 1726882907.14244: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 33932 1726882907.14245: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 33932 1726882907.14318: in VariableManager get_vars() 33932 1726882907.14336: done with get_vars() 33932 1726882907.14421: done processing included file 33932 1726882907.14423: iterating over new_blocks loaded from include file 33932 1726882907.14423: in VariableManager get_vars() 33932 1726882907.14438: done with get_vars() 33932 1726882907.14439: filtering new block on tags 33932 1726882907.14451: done filtering new block on tags 33932 1726882907.14453: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node1 33932 1726882907.14456: extending task lists for all hosts with included blocks 33932 1726882907.14696: done extending task lists 33932 1726882907.14697: done processing included files 33932 1726882907.14698: results queue empty 33932 1726882907.14698: checking for any_errors_fatal 33932 1726882907.14701: done checking for any_errors_fatal 33932 1726882907.14702: checking for 
max_fail_percentage 33932 1726882907.14702: done checking for max_fail_percentage 33932 1726882907.14703: checking to see if all hosts have failed and the running result is not ok 33932 1726882907.14703: done checking to see if all hosts have failed 33932 1726882907.14704: getting the remaining hosts for this loop 33932 1726882907.14705: done getting the remaining hosts for this loop 33932 1726882907.14706: getting the next task for host managed_node1 33932 1726882907.14709: done getting next task for host managed_node1 33932 1726882907.14710: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 33932 1726882907.14712: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882907.14714: getting variables 33932 1726882907.14714: in VariableManager get_vars() 33932 1726882907.14723: Calling all_inventory to load vars for managed_node1 33932 1726882907.14724: Calling groups_inventory to load vars for managed_node1 33932 1726882907.14726: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882907.14730: Calling all_plugins_play to load vars for managed_node1 33932 1726882907.14731: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882907.14733: Calling groups_plugins_play to load vars for managed_node1 33932 1726882907.15406: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882907.16315: done with get_vars() 33932 1726882907.16329: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:41:47 -0400 (0:00:00.061) 0:00:27.631 ****** 33932 1726882907.16383: entering _queue_task() for managed_node1/include_tasks 33932 1726882907.16607: worker is 1 (out of 1 available) 33932 1726882907.16621: exiting _queue_task() for managed_node1/include_tasks 33932 1726882907.16635: done queuing things up, now waiting for results queue to drain 33932 1726882907.16637: waiting for pending results... 
33932 1726882907.16815: running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' 33932 1726882907.16889: in run() - task 0e448fcc-3ce9-615b-5c48-000000000aa0 33932 1726882907.16900: variable 'ansible_search_path' from source: unknown 33932 1726882907.16903: variable 'ansible_search_path' from source: unknown 33932 1726882907.16935: calling self._execute() 33932 1726882907.17008: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882907.17011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882907.17020: variable 'omit' from source: magic vars 33932 1726882907.17314: variable 'ansible_distribution_major_version' from source: facts 33932 1726882907.17325: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882907.17330: _execute() done 33932 1726882907.17333: dumping result to json 33932 1726882907.17337: done dumping result, returning 33932 1726882907.17343: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' [0e448fcc-3ce9-615b-5c48-000000000aa0] 33932 1726882907.17348: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000aa0 33932 1726882907.17434: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000aa0 33932 1726882907.17437: WORKER PROCESS EXITING 33932 1726882907.17466: no more pending results, returning what we have 33932 1726882907.17473: in VariableManager get_vars() 33932 1726882907.17522: Calling all_inventory to load vars for managed_node1 33932 1726882907.17524: Calling groups_inventory to load vars for managed_node1 33932 1726882907.17527: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882907.17538: Calling all_plugins_play to load vars for managed_node1 33932 1726882907.17540: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882907.17543: Calling groups_plugins_play to load vars for managed_node1 33932 
1726882907.18426: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882907.19388: done with get_vars() 33932 1726882907.19402: variable 'ansible_search_path' from source: unknown 33932 1726882907.19402: variable 'ansible_search_path' from source: unknown 33932 1726882907.19442: we have included files to process 33932 1726882907.19443: generating all_blocks data 33932 1726882907.19444: done generating all_blocks data 33932 1726882907.19444: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 33932 1726882907.19445: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 33932 1726882907.19446: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 33932 1726882907.19677: done processing included file 33932 1726882907.19679: iterating over new_blocks loaded from include file 33932 1726882907.19680: in VariableManager get_vars() 33932 1726882907.19700: done with get_vars() 33932 1726882907.19701: filtering new block on tags 33932 1726882907.19719: done filtering new block on tags 33932 1726882907.19721: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node1 33932 1726882907.19726: extending task lists for all hosts with included blocks 33932 1726882907.19879: done extending task lists 33932 1726882907.19880: done processing included files 33932 1726882907.19881: results queue empty 33932 1726882907.19882: checking for any_errors_fatal 33932 1726882907.19885: done checking for any_errors_fatal 33932 1726882907.19886: checking for max_fail_percentage 33932 1726882907.19887: done 
checking for max_fail_percentage 33932 1726882907.19887: checking to see if all hosts have failed and the running result is not ok 33932 1726882907.19888: done checking to see if all hosts have failed 33932 1726882907.19889: getting the remaining hosts for this loop 33932 1726882907.19890: done getting the remaining hosts for this loop 33932 1726882907.19893: getting the next task for host managed_node1 33932 1726882907.19896: done getting next task for host managed_node1 33932 1726882907.19898: ^ task is: TASK: Gather current interface info 33932 1726882907.19901: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882907.19904: getting variables 33932 1726882907.19905: in VariableManager get_vars() 33932 1726882907.19917: Calling all_inventory to load vars for managed_node1 33932 1726882907.19919: Calling groups_inventory to load vars for managed_node1 33932 1726882907.19921: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882907.19926: Calling all_plugins_play to load vars for managed_node1 33932 1726882907.19928: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882907.19930: Calling groups_plugins_play to load vars for managed_node1 33932 1726882907.21035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882907.22003: done with get_vars() 33932 1726882907.22016: done getting variables 33932 1726882907.22044: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:41:47 -0400 (0:00:00.056) 0:00:27.688 ****** 33932 1726882907.22070: entering _queue_task() for managed_node1/command 33932 1726882907.22304: worker is 1 (out of 1 available) 33932 1726882907.22318: exiting _queue_task() for managed_node1/command 33932 1726882907.22328: done queuing things up, now waiting for results queue to drain 33932 1726882907.22330: waiting for pending results... 
33932 1726882907.22509: running TaskExecutor() for managed_node1/TASK: Gather current interface info 33932 1726882907.22591: in run() - task 0e448fcc-3ce9-615b-5c48-000000000ad7 33932 1726882907.22600: variable 'ansible_search_path' from source: unknown 33932 1726882907.22604: variable 'ansible_search_path' from source: unknown 33932 1726882907.22631: calling self._execute() 33932 1726882907.22712: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882907.22716: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882907.22725: variable 'omit' from source: magic vars 33932 1726882907.23126: variable 'ansible_distribution_major_version' from source: facts 33932 1726882907.23152: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882907.23161: variable 'omit' from source: magic vars 33932 1726882907.23215: variable 'omit' from source: magic vars 33932 1726882907.23260: variable 'omit' from source: magic vars 33932 1726882907.23306: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882907.23348: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882907.23384: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882907.23409: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882907.23774: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882907.23778: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882907.23780: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882907.23783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 
1726882907.23785: Set connection var ansible_shell_executable to /bin/sh 33932 1726882907.23788: Set connection var ansible_timeout to 10 33932 1726882907.23790: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882907.23792: Set connection var ansible_pipelining to False 33932 1726882907.23794: Set connection var ansible_connection to ssh 33932 1726882907.23796: Set connection var ansible_shell_type to sh 33932 1726882907.23798: variable 'ansible_shell_executable' from source: unknown 33932 1726882907.23800: variable 'ansible_connection' from source: unknown 33932 1726882907.23803: variable 'ansible_module_compression' from source: unknown 33932 1726882907.23805: variable 'ansible_shell_type' from source: unknown 33932 1726882907.23809: variable 'ansible_shell_executable' from source: unknown 33932 1726882907.23811: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882907.23813: variable 'ansible_pipelining' from source: unknown 33932 1726882907.23816: variable 'ansible_timeout' from source: unknown 33932 1726882907.23817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882907.23820: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882907.23823: variable 'omit' from source: magic vars 33932 1726882907.23825: starting attempt loop 33932 1726882907.23827: running the handler 33932 1726882907.23829: _low_level_execute_command(): starting 33932 1726882907.23831: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 33932 1726882907.24476: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882907.24488: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 33932 1726882907.24498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882907.24513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882907.24549: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882907.24556: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882907.24567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882907.24581: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882907.24589: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882907.24595: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882907.24603: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882907.24613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882907.24623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882907.24630: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882907.24638: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882907.24648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882907.24721: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882907.24740: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882907.24753: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882907.24885: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 33932 1726882907.26556: stdout chunk (state=3): >>>/root <<< 33932 1726882907.26680: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882907.26750: stderr chunk (state=3): >>><<< 33932 1726882907.26753: stdout chunk (state=3): >>><<< 33932 1726882907.26781: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882907.26794: _low_level_execute_command(): starting 33932 1726882907.26800: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882907.2678006-35178-228635095810536 `" && echo ansible-tmp-1726882907.2678006-35178-228635095810536="` echo /root/.ansible/tmp/ansible-tmp-1726882907.2678006-35178-228635095810536 `" ) && sleep 0' 33932 1726882907.27493: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882907.27501: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882907.27512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882907.27815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882907.27824: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882907.27827: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882907.27829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882907.27831: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882907.27833: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882907.27834: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882907.27836: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882907.27838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882907.27839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882907.27841: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882907.27843: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882907.27844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882907.27846: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882907.27848: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882907.27849: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 33932 1726882907.28021: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882907.29874: stdout chunk (state=3): >>>ansible-tmp-1726882907.2678006-35178-228635095810536=/root/.ansible/tmp/ansible-tmp-1726882907.2678006-35178-228635095810536 <<< 33932 1726882907.30036: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882907.30039: stdout chunk (state=3): >>><<< 33932 1726882907.30047: stderr chunk (state=3): >>><<< 33932 1726882907.30066: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882907.2678006-35178-228635095810536=/root/.ansible/tmp/ansible-tmp-1726882907.2678006-35178-228635095810536 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882907.30096: variable 'ansible_module_compression' from source: unknown 33932 1726882907.30151: 
ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-33932njcnmxb6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 33932 1726882907.30188: variable 'ansible_facts' from source: unknown 33932 1726882907.30273: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882907.2678006-35178-228635095810536/AnsiballZ_command.py 33932 1726882907.30403: Sending initial data 33932 1726882907.30408: Sent initial data (156 bytes) 33932 1726882907.31314: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882907.31323: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882907.31332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882907.31345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882907.31384: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882907.31393: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882907.31401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882907.31414: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882907.31421: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882907.31428: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882907.31436: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882907.31445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882907.31456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882907.31465: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 
originally 10.31.44.90 <<< 33932 1726882907.31477: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882907.31487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882907.31557: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882907.31575: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882907.31586: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882907.31705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882907.33418: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 33932 1726882907.33517: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 33932 1726882907.33610: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-33932njcnmxb6/tmp5mvhtn1_ /root/.ansible/tmp/ansible-tmp-1726882907.2678006-35178-228635095810536/AnsiballZ_command.py <<< 33932 1726882907.33706: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 33932 1726882907.35090: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882907.35174: stderr chunk (state=3): >>><<< 33932 1726882907.35178: stdout chunk (state=3): >>><<< 33932 
1726882907.35197: done transferring module to remote 33932 1726882907.35203: _low_level_execute_command(): starting 33932 1726882907.35210: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882907.2678006-35178-228635095810536/ /root/.ansible/tmp/ansible-tmp-1726882907.2678006-35178-228635095810536/AnsiballZ_command.py && sleep 0' 33932 1726882907.35805: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882907.35810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882907.35843: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882907.35848: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882907.35861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882907.35871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882907.35928: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882907.35931: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882907.35942: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882907.36039: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 33932 1726882907.37785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882907.37834: stderr chunk (state=3): >>><<< 33932 1726882907.37838: stdout chunk (state=3): >>><<< 33932 1726882907.37853: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882907.37856: _low_level_execute_command(): starting 33932 1726882907.37861: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882907.2678006-35178-228635095810536/AnsiballZ_command.py && sleep 0' 33932 1726882907.38533: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882907.38544: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882907.38591: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 33932 1726882907.38595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882907.38597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882907.38648: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882907.38659: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882907.38775: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882907.52045: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nlsr101\npeerlsr101", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:41:47.515817", "end": "2024-09-20 21:41:47.519008", "delta": "0:00:00.003191", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 33932 1726882907.53220: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 33932 1726882907.53244: stderr chunk (state=3): >>><<< 33932 1726882907.53247: stdout chunk (state=3): >>><<< 33932 1726882907.53391: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nlsr101\npeerlsr101", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:41:47.515817", "end": "2024-09-20 21:41:47.519008", "delta": "0:00:00.003191", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
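The module invocation logged above (chdir `/sys/class/net`, raw params `ls -1`) corresponds to a task roughly like the following. This is a reconstruction from the logged `module_args`, not the original source file; the `register` variable name is an assumption based on the `_current_interfaces` variable that appears later in the log:

```yaml
# Sketch reconstructed from the logged module_args; register name is assumed.
- name: Gather current interface info
  ansible.builtin.command:
    chdir: /sys/class/net
    cmd: ls -1
  register: _current_interfaces
```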
33932 1726882907.53395: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882907.2678006-35178-228635095810536/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 33932 1726882907.53402: _low_level_execute_command(): starting 33932 1726882907.53404: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882907.2678006-35178-228635095810536/ > /dev/null 2>&1 && sleep 0' 33932 1726882907.53975: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882907.53990: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882907.54004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882907.54021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882907.54062: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882907.54081: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882907.54095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882907.54112: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882907.54125: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882907.54137: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882907.54149: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882907.54161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882907.54185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882907.54197: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882907.54207: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882907.54219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882907.54303: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882907.54319: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882907.54334: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882907.54462: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882907.56323: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882907.56326: stdout chunk (state=3): >>><<< 33932 1726882907.56328: stderr chunk (state=3): >>><<< 33932 1726882907.56341: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882907.56347: handler run complete 33932 1726882907.56374: Evaluated conditional (False): False 33932 1726882907.56385: attempt loop complete, returning result 33932 1726882907.56388: _execute() done 33932 1726882907.56390: dumping result to json 33932 1726882907.56395: done dumping result, returning 33932 1726882907.56403: done running TaskExecutor() for managed_node1/TASK: Gather current interface info [0e448fcc-3ce9-615b-5c48-000000000ad7] 33932 1726882907.56409: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000ad7 33932 1726882907.56514: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000ad7 33932 1726882907.56516: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003191", "end": "2024-09-20 21:41:47.519008", "rc": 0, "start": "2024-09-20 21:41:47.515817" } STDOUT: bonding_masters eth0 lo lsr101 peerlsr101 33932 1726882907.56606: no more pending results, returning what we have 33932 1726882907.56609: results queue empty 33932 1726882907.56610: checking for any_errors_fatal 33932 1726882907.56612: done checking for any_errors_fatal 33932 1726882907.56613: checking for max_fail_percentage 33932 1726882907.56614: done checking for max_fail_percentage 33932 1726882907.56615: checking to see if all 
hosts have failed and the running result is not ok 33932 1726882907.56616: done checking to see if all hosts have failed 33932 1726882907.56617: getting the remaining hosts for this loop 33932 1726882907.56619: done getting the remaining hosts for this loop 33932 1726882907.56622: getting the next task for host managed_node1 33932 1726882907.56629: done getting next task for host managed_node1 33932 1726882907.56632: ^ task is: TASK: Set current_interfaces 33932 1726882907.56637: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882907.56641: getting variables 33932 1726882907.56642: in VariableManager get_vars() 33932 1726882907.56688: Calling all_inventory to load vars for managed_node1 33932 1726882907.56691: Calling groups_inventory to load vars for managed_node1 33932 1726882907.56693: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882907.56702: Calling all_plugins_play to load vars for managed_node1 33932 1726882907.56705: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882907.56707: Calling groups_plugins_play to load vars for managed_node1 33932 1726882907.58305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882907.60145: done with get_vars() 33932 1726882907.60171: done getting variables 33932 1726882907.60232: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:41:47 -0400 (0:00:00.381) 0:00:28.070 ****** 33932 1726882907.60271: entering _queue_task() for managed_node1/set_fact 33932 1726882907.60561: worker is 1 (out of 1 available) 33932 1726882907.60575: exiting _queue_task() for managed_node1/set_fact 33932 1726882907.60586: done queuing things up, now waiting for results queue to drain 33932 1726882907.60588: waiting for pending results... 
33932 1726882907.60888: running TaskExecutor() for managed_node1/TASK: Set current_interfaces 33932 1726882907.61036: in run() - task 0e448fcc-3ce9-615b-5c48-000000000ad8 33932 1726882907.61057: variable 'ansible_search_path' from source: unknown 33932 1726882907.61071: variable 'ansible_search_path' from source: unknown 33932 1726882907.61119: calling self._execute() 33932 1726882907.61229: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882907.61242: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882907.61261: variable 'omit' from source: magic vars 33932 1726882907.61694: variable 'ansible_distribution_major_version' from source: facts 33932 1726882907.61713: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882907.61724: variable 'omit' from source: magic vars 33932 1726882907.61800: variable 'omit' from source: magic vars 33932 1726882907.61940: variable '_current_interfaces' from source: set_fact 33932 1726882907.62189: variable 'omit' from source: magic vars 33932 1726882907.62244: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882907.62290: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882907.62320: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882907.62347: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882907.62367: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882907.62404: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882907.62418: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882907.62427: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882907.62553: Set connection var ansible_shell_executable to /bin/sh 33932 1726882907.62573: Set connection var ansible_timeout to 10 33932 1726882907.62585: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882907.62595: Set connection var ansible_pipelining to False 33932 1726882907.62602: Set connection var ansible_connection to ssh 33932 1726882907.62608: Set connection var ansible_shell_type to sh 33932 1726882907.62642: variable 'ansible_shell_executable' from source: unknown 33932 1726882907.62652: variable 'ansible_connection' from source: unknown 33932 1726882907.62662: variable 'ansible_module_compression' from source: unknown 33932 1726882907.62675: variable 'ansible_shell_type' from source: unknown 33932 1726882907.62684: variable 'ansible_shell_executable' from source: unknown 33932 1726882907.62691: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882907.62698: variable 'ansible_pipelining' from source: unknown 33932 1726882907.62704: variable 'ansible_timeout' from source: unknown 33932 1726882907.62711: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882907.62859: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882907.62882: variable 'omit' from source: magic vars 33932 1726882907.62892: starting attempt loop 33932 1726882907.62900: running the handler 33932 1726882907.62914: handler run complete 33932 1726882907.62927: attempt loop complete, returning result 33932 1726882907.62933: _execute() done 33932 1726882907.62939: dumping result to json 33932 1726882907.62946: done dumping result, returning 33932 
1726882907.62958: done running TaskExecutor() for managed_node1/TASK: Set current_interfaces [0e448fcc-3ce9-615b-5c48-000000000ad8] 33932 1726882907.62976: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000ad8 ok: [managed_node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "lsr101", "peerlsr101" ] }, "changed": false } 33932 1726882907.63181: no more pending results, returning what we have 33932 1726882907.63185: results queue empty 33932 1726882907.63186: checking for any_errors_fatal 33932 1726882907.63197: done checking for any_errors_fatal 33932 1726882907.63198: checking for max_fail_percentage 33932 1726882907.63200: done checking for max_fail_percentage 33932 1726882907.63201: checking to see if all hosts have failed and the running result is not ok 33932 1726882907.63202: done checking to see if all hosts have failed 33932 1726882907.63203: getting the remaining hosts for this loop 33932 1726882907.63205: done getting the remaining hosts for this loop 33932 1726882907.63209: getting the next task for host managed_node1 33932 1726882907.63222: done getting next task for host managed_node1 33932 1726882907.63225: ^ task is: TASK: Show current_interfaces 33932 1726882907.63229: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33932 1726882907.63234: getting variables 33932 1726882907.63236: in VariableManager get_vars() 33932 1726882907.63285: Calling all_inventory to load vars for managed_node1 33932 1726882907.63289: Calling groups_inventory to load vars for managed_node1 33932 1726882907.63292: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882907.63304: Calling all_plugins_play to load vars for managed_node1 33932 1726882907.63307: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882907.63310: Calling groups_plugins_play to load vars for managed_node1 33932 1726882907.64554: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000ad8 33932 1726882907.64557: WORKER PROCESS EXITING 33932 1726882907.65752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882907.68591: done with get_vars() 33932 1726882907.68619: done getting variables 33932 1726882907.68774: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:41:47 -0400 (0:00:00.085) 0:00:28.155 ****** 33932 1726882907.68811: entering _queue_task() for managed_node1/debug 33932 1726882907.69170: worker is 1 (out of 1 available) 33932 1726882907.69183: exiting _queue_task() for managed_node1/debug 33932 1726882907.69193: done queuing things up, now waiting for results queue to drain 33932 
1726882907.69195: waiting for pending results... 33932 1726882907.70326: running TaskExecutor() for managed_node1/TASK: Show current_interfaces 33932 1726882907.70457: in run() - task 0e448fcc-3ce9-615b-5c48-000000000aa1 33932 1726882907.70541: variable 'ansible_search_path' from source: unknown 33932 1726882907.70550: variable 'ansible_search_path' from source: unknown 33932 1726882907.70601: calling self._execute() 33932 1726882907.70961: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882907.70979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882907.70995: variable 'omit' from source: magic vars 33932 1726882907.71856: variable 'ansible_distribution_major_version' from source: facts 33932 1726882907.71884: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882907.71897: variable 'omit' from source: magic vars 33932 1726882907.71954: variable 'omit' from source: magic vars 33932 1726882907.72259: variable 'current_interfaces' from source: set_fact 33932 1726882907.72304: variable 'omit' from source: magic vars 33932 1726882907.72346: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882907.72402: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882907.72493: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882907.72514: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882907.72587: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882907.72619: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882907.72693: variable 'ansible_host' from source: host vars for 
'managed_node1' 33932 1726882907.72701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882907.72917: Set connection var ansible_shell_executable to /bin/sh 33932 1726882907.72927: Set connection var ansible_timeout to 10 33932 1726882907.72935: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882907.72943: Set connection var ansible_pipelining to False 33932 1726882907.72948: Set connection var ansible_connection to ssh 33932 1726882907.72952: Set connection var ansible_shell_type to sh 33932 1726882907.72983: variable 'ansible_shell_executable' from source: unknown 33932 1726882907.72989: variable 'ansible_connection' from source: unknown 33932 1726882907.72995: variable 'ansible_module_compression' from source: unknown 33932 1726882907.73001: variable 'ansible_shell_type' from source: unknown 33932 1726882907.73013: variable 'ansible_shell_executable' from source: unknown 33932 1726882907.73125: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882907.73135: variable 'ansible_pipelining' from source: unknown 33932 1726882907.73142: variable 'ansible_timeout' from source: unknown 33932 1726882907.73154: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882907.73303: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882907.73457: variable 'omit' from source: magic vars 33932 1726882907.73473: starting attempt loop 33932 1726882907.73482: running the handler 33932 1726882907.73534: handler run complete 33932 1726882907.73679: attempt loop complete, returning result 33932 1726882907.73687: _execute() done 33932 1726882907.73694: dumping result to json 33932 1726882907.73702: done 
dumping result, returning 33932 1726882907.73714: done running TaskExecutor() for managed_node1/TASK: Show current_interfaces [0e448fcc-3ce9-615b-5c48-000000000aa1] 33932 1726882907.73723: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000aa1 ok: [managed_node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'lsr101', 'peerlsr101'] 33932 1726882907.73881: no more pending results, returning what we have 33932 1726882907.73885: results queue empty 33932 1726882907.73886: checking for any_errors_fatal 33932 1726882907.73893: done checking for any_errors_fatal 33932 1726882907.73894: checking for max_fail_percentage 33932 1726882907.73896: done checking for max_fail_percentage 33932 1726882907.73897: checking to see if all hosts have failed and the running result is not ok 33932 1726882907.73898: done checking to see if all hosts have failed 33932 1726882907.73899: getting the remaining hosts for this loop 33932 1726882907.73901: done getting the remaining hosts for this loop 33932 1726882907.73904: getting the next task for host managed_node1 33932 1726882907.73913: done getting next task for host managed_node1 33932 1726882907.73918: ^ task is: TASK: Install iproute 33932 1726882907.73921: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882907.73925: getting variables 33932 1726882907.73927: in VariableManager get_vars() 33932 1726882907.73977: Calling all_inventory to load vars for managed_node1 33932 1726882907.73980: Calling groups_inventory to load vars for managed_node1 33932 1726882907.73983: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882907.73996: Calling all_plugins_play to load vars for managed_node1 33932 1726882907.73999: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882907.74002: Calling groups_plugins_play to load vars for managed_node1 33932 1726882907.74984: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000aa1 33932 1726882907.74987: WORKER PROCESS EXITING 33932 1726882907.75781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882907.77482: done with get_vars() 33932 1726882907.77503: done getting variables 33932 1726882907.77551: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 21:41:47 -0400 (0:00:00.087) 0:00:28.243 ****** 33932 1726882907.77584: entering _queue_task() for managed_node1/package 33932 1726882907.77862: worker is 1 (out of 1 available) 33932 1726882907.77879: exiting _queue_task() for managed_node1/package 33932 1726882907.77890: done queuing things up, now waiting for results queue to drain 33932 1726882907.77892: waiting for pending results... 
33932 1726882907.78495: running TaskExecutor() for managed_node1/TASK: Install iproute 33932 1726882907.78644: in run() - task 0e448fcc-3ce9-615b-5c48-00000000093f 33932 1726882907.78786: variable 'ansible_search_path' from source: unknown 33932 1726882907.78793: variable 'ansible_search_path' from source: unknown 33932 1726882907.78831: calling self._execute() 33932 1726882907.78972: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882907.79097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882907.79112: variable 'omit' from source: magic vars 33932 1726882907.79834: variable 'ansible_distribution_major_version' from source: facts 33932 1726882907.79977: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882907.79989: variable 'omit' from source: magic vars 33932 1726882907.80030: variable 'omit' from source: magic vars 33932 1726882907.80370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 33932 1726882907.84110: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 33932 1726882907.84187: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 33932 1726882907.84226: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 33932 1726882907.84272: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 33932 1726882907.84321: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 33932 1726882907.84423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33932 1726882907.84455: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33932 1726882907.84488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33932 1726882907.84534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33932 1726882907.84553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33932 1726882907.84666: variable '__network_is_ostree' from source: set_fact 33932 1726882907.84680: variable 'omit' from source: magic vars 33932 1726882907.84712: variable 'omit' from source: magic vars 33932 1726882907.84747: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882907.84781: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882907.84802: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882907.84822: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882907.84839: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882907.84874: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882907.84882: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882907.84889: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 33932 1726882907.84991: Set connection var ansible_shell_executable to /bin/sh 33932 1726882907.85003: Set connection var ansible_timeout to 10 33932 1726882907.85012: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882907.85020: Set connection var ansible_pipelining to False 33932 1726882907.85026: Set connection var ansible_connection to ssh 33932 1726882907.85031: Set connection var ansible_shell_type to sh 33932 1726882907.85062: variable 'ansible_shell_executable' from source: unknown 33932 1726882907.85074: variable 'ansible_connection' from source: unknown 33932 1726882907.85081: variable 'ansible_module_compression' from source: unknown 33932 1726882907.85088: variable 'ansible_shell_type' from source: unknown 33932 1726882907.85093: variable 'ansible_shell_executable' from source: unknown 33932 1726882907.85099: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882907.85106: variable 'ansible_pipelining' from source: unknown 33932 1726882907.85111: variable 'ansible_timeout' from source: unknown 33932 1726882907.85117: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882907.85216: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882907.85230: variable 'omit' from source: magic vars 33932 1726882907.85239: starting attempt loop 33932 1726882907.85245: running the handler 33932 1726882907.85253: variable 'ansible_facts' from source: unknown 33932 1726882907.85261: variable 'ansible_facts' from source: unknown 33932 1726882907.85301: _low_level_execute_command(): starting 33932 1726882907.85312: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 33932 
1726882907.86029: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882907.86047: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882907.86062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882907.86088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882907.86131: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882907.86141: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882907.86157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882907.86180: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882907.86191: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882907.86201: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882907.86211: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882907.86223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882907.86237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882907.86247: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882907.86258: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882907.86281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882907.86357: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882907.86398: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882907.86416: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882907.86583: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882907.88227: stdout chunk (state=3): >>>/root <<< 33932 1726882907.88384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882907.88432: stderr chunk (state=3): >>><<< 33932 1726882907.88435: stdout chunk (state=3): >>><<< 33932 1726882907.88544: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882907.88548: _low_level_execute_command(): starting 33932 1726882907.88550: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882907.884528-35210-237123902139450 `" && echo 
ansible-tmp-1726882907.884528-35210-237123902139450="` echo /root/.ansible/tmp/ansible-tmp-1726882907.884528-35210-237123902139450 `" ) && sleep 0' 33932 1726882907.90077: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882907.90133: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882907.90144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882907.90245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882907.90287: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882907.90295: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882907.90302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882907.90316: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882907.90323: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882907.90329: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882907.90338: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882907.90350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882907.90361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882907.90371: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882907.90374: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882907.90384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882907.90456: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 33932 1726882907.90588: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882907.90600: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882907.90729: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882907.92589: stdout chunk (state=3): >>>ansible-tmp-1726882907.884528-35210-237123902139450=/root/.ansible/tmp/ansible-tmp-1726882907.884528-35210-237123902139450 <<< 33932 1726882907.92711: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882907.92804: stderr chunk (state=3): >>><<< 33932 1726882907.92812: stdout chunk (state=3): >>><<< 33932 1726882907.92835: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882907.884528-35210-237123902139450=/root/.ansible/tmp/ansible-tmp-1726882907.884528-35210-237123902139450 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882907.92899: variable 'ansible_module_compression' from source: unknown 33932 1726882907.92969: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-33932njcnmxb6/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 33932 1726882907.93011: variable 'ansible_facts' from source: unknown 33932 1726882907.93113: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882907.884528-35210-237123902139450/AnsiballZ_dnf.py 33932 1726882907.93724: Sending initial data 33932 1726882907.93727: Sent initial data (151 bytes) 33932 1726882907.95129: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882907.95145: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882907.95189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882907.95196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882907.95263: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 33932 1726882907.95274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882907.95300: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882907.95304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 33932 1726882907.95318: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882907.95400: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882907.95419: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882907.95442: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882907.95562: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882907.97409: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 33932 1726882907.97572: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 33932 1726882907.97576: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-33932njcnmxb6/tmpsfkgqtfs /root/.ansible/tmp/ansible-tmp-1726882907.884528-35210-237123902139450/AnsiballZ_dnf.py <<< 33932 1726882907.97673: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 33932 1726882907.99286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882907.99390: stderr chunk (state=3): >>><<< 33932 1726882907.99394: stdout chunk (state=3): >>><<< 33932 1726882907.99412: done transferring module to remote 33932 1726882907.99421: _low_level_execute_command(): starting 33932 1726882907.99426: _low_level_execute_command(): executing: /bin/sh -c 'chmod 
u+x /root/.ansible/tmp/ansible-tmp-1726882907.884528-35210-237123902139450/ /root/.ansible/tmp/ansible-tmp-1726882907.884528-35210-237123902139450/AnsiballZ_dnf.py && sleep 0' 33932 1726882907.99856: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882907.99862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882907.99909: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882907.99913: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882907.99915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882907.99968: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882907.99974: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882908.00082: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882908.01931: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882908.01934: stdout chunk (state=3): >>><<< 33932 1726882908.01937: stderr chunk (state=3): >>><<< 33932 1726882908.01939: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882908.01942: _low_level_execute_command(): starting 33932 1726882908.01944: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882907.884528-35210-237123902139450/AnsiballZ_dnf.py && sleep 0' 33932 1726882908.02739: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882908.02751: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882908.02757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882908.02776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882908.02813: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882908.02820: 
stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882908.02830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882908.02846: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882908.02850: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882908.02875: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882908.02878: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882908.02882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882908.02895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882908.02902: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882908.02908: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882908.02917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882908.02996: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882908.03009: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882908.03019: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882908.03502: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882909.03974: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": 
[], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 33932 1726882909.09683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 33932 1726882909.09687: stdout chunk (state=3): >>><<< 33932 1726882909.09689: stderr chunk (state=3): >>><<< 33932 1726882909.09832: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 33932 1726882909.09837: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882907.884528-35210-237123902139450/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 33932 1726882909.09840: _low_level_execute_command(): starting 33932 1726882909.09842: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882907.884528-35210-237123902139450/ > /dev/null 2>&1 && sleep 0' 33932 1726882909.10424: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882909.10437: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882909.10451: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882909.10475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882909.10516: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882909.10528: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882909.10540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882909.10557: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882909.10573: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882909.10584: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882909.10594: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882909.10606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882909.10622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882909.10632: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882909.10642: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882909.10655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882909.10735: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882909.10751: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882909.10769: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882909.10897: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882909.12748: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 33932 1726882909.12819: stderr chunk (state=3): >>><<< 33932 1726882909.12829: stdout chunk (state=3): >>><<< 33932 1726882909.13468: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882909.13472: handler run complete 33932 1726882909.13474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 33932 1726882909.13476: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 33932 1726882909.13479: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 33932 1726882909.13481: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 33932 1726882909.13483: Loading TestModule 'uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 33932 1726882909.13485: variable '__install_status' from source: set_fact 33932 1726882909.13487: Evaluated conditional (__install_status is success): True 33932 1726882909.13488: attempt loop complete, returning result 33932 1726882909.13490: _execute() done 33932 1726882909.13492: dumping result to json 33932 1726882909.13494: done dumping result, returning 33932 1726882909.13496: done running TaskExecutor() for managed_node1/TASK: Install iproute [0e448fcc-3ce9-615b-5c48-00000000093f] 33932 1726882909.13501: sending task result for task 0e448fcc-3ce9-615b-5c48-00000000093f ok: [managed_node1] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 33932 1726882909.13651: no more pending results, returning what we have 33932 1726882909.13655: results queue empty 33932 1726882909.13656: checking for any_errors_fatal 33932 1726882909.13665: done checking for any_errors_fatal 33932 1726882909.13666: checking for max_fail_percentage 33932 1726882909.13670: done checking for max_fail_percentage 33932 1726882909.13671: checking to see if all hosts have failed and the running result is not ok 33932 1726882909.13672: done checking to see if all hosts have failed 33932 1726882909.13672: getting the remaining hosts for this loop 33932 1726882909.13674: done getting the remaining hosts for this loop 33932 1726882909.13678: getting the next task for host managed_node1 33932 1726882909.13683: done getting next task for host managed_node1 33932 1726882909.13686: ^ task is: TASK: Create veth interface {{ interface }} 33932 1726882909.13689: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33932 1726882909.13692: getting variables 33932 1726882909.13694: in VariableManager get_vars() 33932 1726882909.13740: Calling all_inventory to load vars for managed_node1 33932 1726882909.13743: Calling groups_inventory to load vars for managed_node1 33932 1726882909.13745: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882909.13757: Calling all_plugins_play to load vars for managed_node1 33932 1726882909.13759: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882909.13763: Calling groups_plugins_play to load vars for managed_node1 33932 1726882909.14333: done sending task result for task 0e448fcc-3ce9-615b-5c48-00000000093f 33932 1726882909.14337: WORKER PROCESS EXITING 33932 1726882909.15405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882909.17188: done with get_vars() 33932 1726882909.17210: done getting variables 33932 1726882909.17281: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 33932 1726882909.17417: variable 'interface' from source: play vars TASK [Create veth interface lsr101] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 21:41:49 -0400 
(0:00:01.398) 0:00:29.642 ****** 33932 1726882909.17448: entering _queue_task() for managed_node1/command 33932 1726882909.17784: worker is 1 (out of 1 available) 33932 1726882909.17797: exiting _queue_task() for managed_node1/command 33932 1726882909.17808: done queuing things up, now waiting for results queue to drain 33932 1726882909.17810: waiting for pending results... 33932 1726882909.18126: running TaskExecutor() for managed_node1/TASK: Create veth interface lsr101 33932 1726882909.18251: in run() - task 0e448fcc-3ce9-615b-5c48-000000000940 33932 1726882909.18281: variable 'ansible_search_path' from source: unknown 33932 1726882909.18291: variable 'ansible_search_path' from source: unknown 33932 1726882909.18595: variable 'interface' from source: play vars 33932 1726882909.18700: variable 'interface' from source: play vars 33932 1726882909.18784: variable 'interface' from source: play vars 33932 1726882909.18950: Loaded config def from plugin (lookup/items) 33932 1726882909.18967: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 33932 1726882909.18999: variable 'omit' from source: magic vars 33932 1726882909.19135: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882909.19147: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882909.19159: variable 'omit' from source: magic vars 33932 1726882909.19431: variable 'ansible_distribution_major_version' from source: facts 33932 1726882909.19442: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882909.19836: variable 'type' from source: play vars 33932 1726882909.19851: variable 'state' from source: include params 33932 1726882909.19859: variable 'interface' from source: play vars 33932 1726882909.19871: variable 'current_interfaces' from source: set_fact 33932 1726882909.19882: Evaluated conditional (type == 'veth' and state == 'present' and interface not in 
current_interfaces): False 33932 1726882909.19888: when evaluation is False, skipping this task 33932 1726882909.19916: variable 'item' from source: unknown 33932 1726882909.20024: variable 'item' from source: unknown skipping: [managed_node1] => (item=ip link add lsr101 type veth peer name peerlsr101) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link add lsr101 type veth peer name peerlsr101", "skip_reason": "Conditional result was False" } 33932 1726882909.20398: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882909.20462: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882909.20484: variable 'omit' from source: magic vars 33932 1726882909.20753: variable 'ansible_distribution_major_version' from source: facts 33932 1726882909.20892: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882909.21226: variable 'type' from source: play vars 33932 1726882909.21279: variable 'state' from source: include params 33932 1726882909.21288: variable 'interface' from source: play vars 33932 1726882909.21295: variable 'current_interfaces' from source: set_fact 33932 1726882909.21304: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 33932 1726882909.21316: when evaluation is False, skipping this task 33932 1726882909.21350: variable 'item' from source: unknown 33932 1726882909.21530: variable 'item' from source: unknown skipping: [managed_node1] => (item=ip link set peerlsr101 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set peerlsr101 up", "skip_reason": "Conditional result was False" } 33932 1726882909.21822: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 
1726882909.21917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882909.21940: variable 'omit' from source: magic vars 33932 1726882909.22126: variable 'ansible_distribution_major_version' from source: facts 33932 1726882909.22138: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882909.22340: variable 'type' from source: play vars 33932 1726882909.22349: variable 'state' from source: include params 33932 1726882909.22357: variable 'interface' from source: play vars 33932 1726882909.22367: variable 'current_interfaces' from source: set_fact 33932 1726882909.22380: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 33932 1726882909.22387: when evaluation is False, skipping this task 33932 1726882909.22418: variable 'item' from source: unknown 33932 1726882909.22488: variable 'item' from source: unknown skipping: [managed_node1] => (item=ip link set lsr101 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set lsr101 up", "skip_reason": "Conditional result was False" } 33932 1726882909.22581: dumping result to json 33932 1726882909.22592: done dumping result, returning 33932 1726882909.22600: done running TaskExecutor() for managed_node1/TASK: Create veth interface lsr101 [0e448fcc-3ce9-615b-5c48-000000000940] 33932 1726882909.22608: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000940 33932 1726882909.22678: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000940 skipping: [managed_node1] => { "changed": false } MSG: All items skipped 33932 1726882909.22719: no more pending results, returning what we have 33932 1726882909.22722: results queue empty 33932 1726882909.22723: checking for any_errors_fatal 33932 1726882909.22735: done checking for any_errors_fatal 33932 1726882909.22736: checking 
for max_fail_percentage 33932 1726882909.22738: done checking for max_fail_percentage 33932 1726882909.22739: checking to see if all hosts have failed and the running result is not ok 33932 1726882909.22740: done checking to see if all hosts have failed 33932 1726882909.22740: getting the remaining hosts for this loop 33932 1726882909.22742: done getting the remaining hosts for this loop 33932 1726882909.22746: getting the next task for host managed_node1 33932 1726882909.22753: done getting next task for host managed_node1 33932 1726882909.22756: ^ task is: TASK: Set up veth as managed by NetworkManager 33932 1726882909.22759: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882909.22762: getting variables 33932 1726882909.22766: in VariableManager get_vars() 33932 1726882909.22813: Calling all_inventory to load vars for managed_node1 33932 1726882909.22816: Calling groups_inventory to load vars for managed_node1 33932 1726882909.22818: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882909.22830: Calling all_plugins_play to load vars for managed_node1 33932 1726882909.22833: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882909.22835: Calling groups_plugins_play to load vars for managed_node1 33932 1726882909.25306: WORKER PROCESS EXITING 33932 1726882909.26872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882909.30581: done with get_vars() 33932 1726882909.30611: done getting variables 33932 1726882909.30673: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 21:41:49 -0400 (0:00:00.132) 0:00:29.774 ****** 33932 1726882909.30710: entering _queue_task() for managed_node1/command 33932 1726882909.31032: worker is 1 (out of 1 available) 33932 1726882909.31047: exiting _queue_task() for managed_node1/command 33932 1726882909.31058: done queuing things up, now waiting for results queue to drain 33932 1726882909.31060: waiting for pending results... 
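For reference, the "Create veth interface lsr101" items skipped above (and the `items` lookup the log loads for them) are consistent with a task of roughly this shape. This is a sketch reconstructed from the logged loop items and `false_condition`, not the verbatim source of manage_test_interface.yml; `type`, `state`, `interface`, and `current_interfaces` are the variables named in the log:

```yaml
# Sketch only: reconstructed from the logged items and false_condition.
- name: Create veth interface {{ interface }}
  command: "{{ item }}"
  with_items:
    - ip link add {{ interface }} type veth peer name peer{{ interface }}
    - ip link set peer{{ interface }} up
    - ip link set {{ interface }} up
  when: type == 'veth' and state == 'present' and interface not in current_interfaces
```

In this run the condition evaluates False for every item (state is 'absent'), so all three commands are skipped with "Conditional result was False".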
33932 1726882909.31679: running TaskExecutor() for managed_node1/TASK: Set up veth as managed by NetworkManager 33932 1726882909.31805: in run() - task 0e448fcc-3ce9-615b-5c48-000000000941 33932 1726882909.31828: variable 'ansible_search_path' from source: unknown 33932 1726882909.31836: variable 'ansible_search_path' from source: unknown 33932 1726882909.31888: calling self._execute() 33932 1726882909.32011: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882909.32028: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882909.32043: variable 'omit' from source: magic vars 33932 1726882909.32438: variable 'ansible_distribution_major_version' from source: facts 33932 1726882909.32462: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882909.32638: variable 'type' from source: play vars 33932 1726882909.32649: variable 'state' from source: include params 33932 1726882909.32658: Evaluated conditional (type == 'veth' and state == 'present'): False 33932 1726882909.32678: when evaluation is False, skipping this task 33932 1726882909.32687: _execute() done 33932 1726882909.32694: dumping result to json 33932 1726882909.32702: done dumping result, returning 33932 1726882909.32712: done running TaskExecutor() for managed_node1/TASK: Set up veth as managed by NetworkManager [0e448fcc-3ce9-615b-5c48-000000000941] 33932 1726882909.32722: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000941 skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'veth' and state == 'present'", "skip_reason": "Conditional result was False" } 33932 1726882909.32873: no more pending results, returning what we have 33932 1726882909.32878: results queue empty 33932 1726882909.32879: checking for any_errors_fatal 33932 1726882909.32891: done checking for any_errors_fatal 33932 1726882909.32891: checking for max_fail_percentage 33932 1726882909.32894: done checking for 
max_fail_percentage 33932 1726882909.32895: checking to see if all hosts have failed and the running result is not ok 33932 1726882909.32896: done checking to see if all hosts have failed 33932 1726882909.32897: getting the remaining hosts for this loop 33932 1726882909.32899: done getting the remaining hosts for this loop 33932 1726882909.32903: getting the next task for host managed_node1 33932 1726882909.32911: done getting next task for host managed_node1 33932 1726882909.32913: ^ task is: TASK: Delete veth interface {{ interface }} 33932 1726882909.32917: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882909.32922: getting variables 33932 1726882909.32924: in VariableManager get_vars() 33932 1726882909.32975: Calling all_inventory to load vars for managed_node1 33932 1726882909.32978: Calling groups_inventory to load vars for managed_node1 33932 1726882909.32981: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882909.32996: Calling all_plugins_play to load vars for managed_node1 33932 1726882909.32999: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882909.33002: Calling groups_plugins_play to load vars for managed_node1 33932 1726882909.34003: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000941 33932 1726882909.34007: WORKER PROCESS EXITING 33932 1726882909.35444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882909.38602: done with get_vars() 33932 1726882909.38627: done getting variables 33932 1726882909.39298: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 33932 1726882909.39412: variable 'interface' from source: play vars TASK [Delete veth interface lsr101] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 September 2024 21:41:49 -0400 (0:00:00.087) 0:00:29.862 ****** 33932 1726882909.39442: entering _queue_task() for managed_node1/command 33932 1726882909.40145: worker is 1 (out of 1 available) 33932 1726882909.40158: exiting _queue_task() for managed_node1/command 33932 1726882909.40173: done queuing things up, now waiting for results queue to drain 33932 1726882909.40176: waiting for pending results... 
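Each `_low_level_execute_command()` call below reuses a multiplexed SSH connection, which is why the stderr shows "auto-mux: Trying existing master" and "mux_client_request_session: master session id: 2" instead of a fresh key exchange. Ansible's default `ssh_args` enable this via OpenSSH ControlMaster options; an assumed equivalent ssh_config fragment (not read from this run's actual configuration, and with the socket path elided) would look like:

```
# Assumed sketch of the multiplexing options implied by Ansible's
# default ssh_args; not taken from this run's config files.
Host 10.31.44.90
    ControlMaster auto
    ControlPersist 60s
    ControlPath ~/.ansible/cp/...   # hashed socket name elided
```

With a persisted master socket, each task's command, mkdir, sftp put, and cleanup round-trips over the existing session, which is what keeps the per-command stderr down to the mux handshake lines seen here.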
33932 1726882909.40830: running TaskExecutor() for managed_node1/TASK: Delete veth interface lsr101 33932 1726882909.41339: in run() - task 0e448fcc-3ce9-615b-5c48-000000000942 33932 1726882909.41350: variable 'ansible_search_path' from source: unknown 33932 1726882909.41353: variable 'ansible_search_path' from source: unknown 33932 1726882909.41392: calling self._execute() 33932 1726882909.41484: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882909.41488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882909.41497: variable 'omit' from source: magic vars 33932 1726882909.41852: variable 'ansible_distribution_major_version' from source: facts 33932 1726882909.41865: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882909.42070: variable 'type' from source: play vars 33932 1726882909.42074: variable 'state' from source: include params 33932 1726882909.42076: variable 'interface' from source: play vars 33932 1726882909.42085: variable 'current_interfaces' from source: set_fact 33932 1726882909.42093: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): True 33932 1726882909.42099: variable 'omit' from source: magic vars 33932 1726882909.42135: variable 'omit' from source: magic vars 33932 1726882909.42232: variable 'interface' from source: play vars 33932 1726882909.42248: variable 'omit' from source: magic vars 33932 1726882909.42291: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882909.42328: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882909.42357: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882909.42366: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 33932 1726882909.42380: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882909.42415: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882909.42418: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882909.42421: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882909.42529: Set connection var ansible_shell_executable to /bin/sh 33932 1726882909.42537: Set connection var ansible_timeout to 10 33932 1726882909.42542: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882909.42547: Set connection var ansible_pipelining to False 33932 1726882909.42550: Set connection var ansible_connection to ssh 33932 1726882909.42552: Set connection var ansible_shell_type to sh 33932 1726882909.42578: variable 'ansible_shell_executable' from source: unknown 33932 1726882909.42582: variable 'ansible_connection' from source: unknown 33932 1726882909.42584: variable 'ansible_module_compression' from source: unknown 33932 1726882909.42586: variable 'ansible_shell_type' from source: unknown 33932 1726882909.42589: variable 'ansible_shell_executable' from source: unknown 33932 1726882909.42591: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882909.42593: variable 'ansible_pipelining' from source: unknown 33932 1726882909.42597: variable 'ansible_timeout' from source: unknown 33932 1726882909.42601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882909.42744: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882909.42753: 
variable 'omit' from source: magic vars 33932 1726882909.42757: starting attempt loop 33932 1726882909.42760: running the handler 33932 1726882909.42778: _low_level_execute_command(): starting 33932 1726882909.42786: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 33932 1726882909.43549: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882909.43560: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882909.43578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882909.43593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882909.43637: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882909.43644: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882909.43654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882909.43673: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882909.43679: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882909.43687: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882909.43695: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882909.43705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882909.43718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882909.43729: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882909.43737: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882909.43746: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882909.43819: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882909.43843: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882909.43855: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882909.43986: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882909.45647: stdout chunk (state=3): >>>/root <<< 33932 1726882909.45779: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882909.45831: stderr chunk (state=3): >>><<< 33932 1726882909.45834: stdout chunk (state=3): >>><<< 33932 1726882909.45857: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882909.45873: 
_low_level_execute_command(): starting 33932 1726882909.45876: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882909.4585538-35261-3663003457487 `" && echo ansible-tmp-1726882909.4585538-35261-3663003457487="` echo /root/.ansible/tmp/ansible-tmp-1726882909.4585538-35261-3663003457487 `" ) && sleep 0' 33932 1726882909.46914: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882909.46923: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882909.46938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882909.46951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882909.46990: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882909.47002: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882909.47006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882909.47020: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882909.47027: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882909.47036: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882909.47046: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882909.47056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882909.47072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882909.47078: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 
1726882909.47086: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882909.47095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882909.47172: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882909.47186: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882909.47197: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882909.47312: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882909.49162: stdout chunk (state=3): >>>ansible-tmp-1726882909.4585538-35261-3663003457487=/root/.ansible/tmp/ansible-tmp-1726882909.4585538-35261-3663003457487 <<< 33932 1726882909.49327: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882909.49330: stdout chunk (state=3): >>><<< 33932 1726882909.49337: stderr chunk (state=3): >>><<< 33932 1726882909.49355: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882909.4585538-35261-3663003457487=/root/.ansible/tmp/ansible-tmp-1726882909.4585538-35261-3663003457487 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882909.49385: variable 'ansible_module_compression' from source: unknown 33932 1726882909.49435: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-33932njcnmxb6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 33932 1726882909.49470: variable 'ansible_facts' from source: unknown 33932 1726882909.49554: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882909.4585538-35261-3663003457487/AnsiballZ_command.py 33932 1726882909.49776: Sending initial data 33932 1726882909.49779: Sent initial data (154 bytes) 33932 1726882909.50806: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882909.50813: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882909.50824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882909.50837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882909.50877: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882909.50884: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882909.50898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882909.50911: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882909.50919: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882909.50924: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882909.50932: 
stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882909.50941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882909.50953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882909.50960: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882909.50971: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882909.50978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882909.51050: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882909.51063: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882909.51077: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882909.51196: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882909.52915: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 33932 1726882909.53001: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 33932 1726882909.53095: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-33932njcnmxb6/tmpmspzili4 
/root/.ansible/tmp/ansible-tmp-1726882909.4585538-35261-3663003457487/AnsiballZ_command.py <<< 33932 1726882909.53187: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 33932 1726882909.54584: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882909.54658: stderr chunk (state=3): >>><<< 33932 1726882909.54662: stdout chunk (state=3): >>><<< 33932 1726882909.54685: done transferring module to remote 33932 1726882909.54695: _low_level_execute_command(): starting 33932 1726882909.54699: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882909.4585538-35261-3663003457487/ /root/.ansible/tmp/ansible-tmp-1726882909.4585538-35261-3663003457487/AnsiballZ_command.py && sleep 0' 33932 1726882909.55308: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33932 1726882909.55323: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882909.55334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882909.55347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882909.55385: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882909.55393: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882909.55403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882909.55416: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882909.55430: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882909.55436: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882909.55444: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 33932 1726882909.55454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882909.55469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882909.55482: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882909.55489: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882909.55498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882909.55576: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882909.55593: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882909.55604: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882909.55735: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882909.57458: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882909.57574: stderr chunk (state=3): >>><<< 33932 1726882909.57578: stdout chunk (state=3): >>><<< 33932 1726882909.57581: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882909.57583: _low_level_execute_command(): starting 33932 1726882909.57585: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882909.4585538-35261-3663003457487/AnsiballZ_command.py && sleep 0' 33932 1726882909.59029: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882909.59036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882909.59085: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882909.59090: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882909.59103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882909.59110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882909.59207: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882909.59225: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882909.59346: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882909.73431: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "lsr101", "type", "veth"], "start": "2024-09-20 21:41:49.721915", "end": "2024-09-20 21:41:49.731399", "delta": "0:00:00.009484", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del lsr101 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 33932 1726882909.74933: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 33932 1726882909.74937: stdout chunk (state=3): >>><<< 33932 1726882909.74944: stderr chunk (state=3): >>><<< 33932 1726882909.74973: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "lsr101", "type", "veth"], "start": "2024-09-20 21:41:49.721915", "end": "2024-09-20 21:41:49.731399", "delta": "0:00:00.009484", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del lsr101 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 33932 1726882909.75011: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del lsr101 type veth', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882909.4585538-35261-3663003457487/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 33932 1726882909.75018: _low_level_execute_command(): starting 33932 1726882909.75024: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882909.4585538-35261-3663003457487/ > /dev/null 2>&1 && sleep 0' 33932 1726882909.75670: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 33932 1726882909.75679: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882909.75690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882909.75704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882909.75740: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882909.75748: stderr chunk (state=3): >>>debug2: match not found <<< 33932 1726882909.75765: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882909.75781: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33932 1726882909.75788: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 33932 1726882909.75795: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 33932 1726882909.75803: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882909.75812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882909.75823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882909.75830: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 33932 1726882909.75837: stderr chunk (state=3): >>>debug2: match found <<< 33932 1726882909.75846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882909.75926: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882909.75943: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882909.75954: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 33932 1726882909.76076: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882909.77883: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882909.77975: stderr chunk (state=3): >>><<< 33932 1726882909.77979: stdout chunk (state=3): >>><<< 33932 1726882909.78279: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882909.78283: handler run complete 33932 1726882909.78285: Evaluated conditional (False): False 33932 1726882909.78287: attempt loop complete, returning result 33932 1726882909.78289: _execute() done 33932 1726882909.78291: dumping result to json 33932 1726882909.78293: done dumping result, returning 33932 1726882909.78295: done running TaskExecutor() for managed_node1/TASK: Delete veth interface lsr101 [0e448fcc-3ce9-615b-5c48-000000000942] 33932 
1726882909.78297: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000942 33932 1726882909.78366: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000942 33932 1726882909.78372: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ip", "link", "del", "lsr101", "type", "veth" ], "delta": "0:00:00.009484", "end": "2024-09-20 21:41:49.731399", "rc": 0, "start": "2024-09-20 21:41:49.721915" } 33932 1726882909.78447: no more pending results, returning what we have 33932 1726882909.78451: results queue empty 33932 1726882909.78452: checking for any_errors_fatal 33932 1726882909.78461: done checking for any_errors_fatal 33932 1726882909.78461: checking for max_fail_percentage 33932 1726882909.78471: done checking for max_fail_percentage 33932 1726882909.78473: checking to see if all hosts have failed and the running result is not ok 33932 1726882909.78474: done checking to see if all hosts have failed 33932 1726882909.78475: getting the remaining hosts for this loop 33932 1726882909.78477: done getting the remaining hosts for this loop 33932 1726882909.78481: getting the next task for host managed_node1 33932 1726882909.78488: done getting next task for host managed_node1 33932 1726882909.78492: ^ task is: TASK: Create dummy interface {{ interface }} 33932 1726882909.78495: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882909.78500: getting variables 33932 1726882909.78502: in VariableManager get_vars() 33932 1726882909.78547: Calling all_inventory to load vars for managed_node1 33932 1726882909.78550: Calling groups_inventory to load vars for managed_node1 33932 1726882909.78554: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882909.78570: Calling all_plugins_play to load vars for managed_node1 33932 1726882909.78574: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882909.78580: Calling groups_plugins_play to load vars for managed_node1 33932 1726882909.79724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882909.80742: done with get_vars() 33932 1726882909.80798: done getting variables 33932 1726882909.80858: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 33932 1726882909.80996: variable 'interface' from source: play vars TASK [Create dummy interface lsr101] ******************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Friday 20 September 2024 21:41:49 -0400 (0:00:00.415) 0:00:30.277 ****** 33932 1726882909.81028: entering _queue_task() for managed_node1/command 33932 1726882909.81362: worker is 1 (out of 1 available) 33932 1726882909.81396: exiting _queue_task() for managed_node1/command 33932 1726882909.81408: done queuing things up, now waiting for results queue to drain 33932 1726882909.81417: waiting for pending results... 
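(Editorial note: the records above show Ansible executing `ip link del lsr101 type veth` on managed_node1 via `ansible.legacy.command` over a multiplexed SSH connection. A hypothetical reconstruction of the task being run — the task name, the `ip link` invocation, and the result fields come from the log itself, but the exact YAML in `manage_test_interface.yml` and its `when:` conditional are assumptions inferred from the dummy/tap conditionals logged below, not confirmed by this log.)

```yaml
# Hypothetical sketch of the "Delete veth interface {{ interface }}" task
# executed above; reconstructed from the logged command and result, the
# conditional is inferred by analogy and may not match the real file.
- name: Delete veth interface {{ interface }}
  command: ip link del {{ interface }} type veth
  when: type == 'veth' and state == 'absent' and interface in current_interfaces
```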
33932 1726882909.81794: running TaskExecutor() for managed_node1/TASK: Create dummy interface lsr101 33932 1726882909.81861: in run() - task 0e448fcc-3ce9-615b-5c48-000000000943 33932 1726882909.81878: variable 'ansible_search_path' from source: unknown 33932 1726882909.81883: variable 'ansible_search_path' from source: unknown 33932 1726882909.81914: calling self._execute() 33932 1726882909.81994: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882909.82000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882909.82007: variable 'omit' from source: magic vars 33932 1726882909.82274: variable 'ansible_distribution_major_version' from source: facts 33932 1726882909.82285: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882909.82419: variable 'type' from source: play vars 33932 1726882909.82425: variable 'state' from source: include params 33932 1726882909.82428: variable 'interface' from source: play vars 33932 1726882909.82433: variable 'current_interfaces' from source: set_fact 33932 1726882909.82440: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 33932 1726882909.82443: when evaluation is False, skipping this task 33932 1726882909.82445: _execute() done 33932 1726882909.82447: dumping result to json 33932 1726882909.82451: done dumping result, returning 33932 1726882909.82457: done running TaskExecutor() for managed_node1/TASK: Create dummy interface lsr101 [0e448fcc-3ce9-615b-5c48-000000000943] 33932 1726882909.82462: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000943 33932 1726882909.82545: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000943 33932 1726882909.82547: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional 
result was False" } 33932 1726882909.82598: no more pending results, returning what we have 33932 1726882909.82602: results queue empty 33932 1726882909.82603: checking for any_errors_fatal 33932 1726882909.82612: done checking for any_errors_fatal 33932 1726882909.82612: checking for max_fail_percentage 33932 1726882909.82614: done checking for max_fail_percentage 33932 1726882909.82615: checking to see if all hosts have failed and the running result is not ok 33932 1726882909.82616: done checking to see if all hosts have failed 33932 1726882909.82616: getting the remaining hosts for this loop 33932 1726882909.82618: done getting the remaining hosts for this loop 33932 1726882909.82621: getting the next task for host managed_node1 33932 1726882909.82627: done getting next task for host managed_node1 33932 1726882909.82629: ^ task is: TASK: Delete dummy interface {{ interface }} 33932 1726882909.82632: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882909.82635: getting variables 33932 1726882909.82636: in VariableManager get_vars() 33932 1726882909.82675: Calling all_inventory to load vars for managed_node1 33932 1726882909.82679: Calling groups_inventory to load vars for managed_node1 33932 1726882909.82681: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882909.82690: Calling all_plugins_play to load vars for managed_node1 33932 1726882909.82693: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882909.82695: Calling groups_plugins_play to load vars for managed_node1 33932 1726882909.83493: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882909.85320: done with get_vars() 33932 1726882909.85359: done getting variables 33932 1726882909.85441: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 33932 1726882909.85563: variable 'interface' from source: play vars TASK [Delete dummy interface lsr101] ******************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Friday 20 September 2024 21:41:49 -0400 (0:00:00.045) 0:00:30.323 ****** 33932 1726882909.85597: entering _queue_task() for managed_node1/command 33932 1726882909.85833: worker is 1 (out of 1 available) 33932 1726882909.85847: exiting _queue_task() for managed_node1/command 33932 1726882909.85858: done queuing things up, now waiting for results queue to drain 33932 1726882909.85860: waiting for pending results... 
33932 1726882909.86042: running TaskExecutor() for managed_node1/TASK: Delete dummy interface lsr101 33932 1726882909.86114: in run() - task 0e448fcc-3ce9-615b-5c48-000000000944 33932 1726882909.86126: variable 'ansible_search_path' from source: unknown 33932 1726882909.86129: variable 'ansible_search_path' from source: unknown 33932 1726882909.86159: calling self._execute() 33932 1726882909.86237: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882909.86241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882909.86250: variable 'omit' from source: magic vars 33932 1726882909.86522: variable 'ansible_distribution_major_version' from source: facts 33932 1726882909.86533: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882909.86667: variable 'type' from source: play vars 33932 1726882909.86673: variable 'state' from source: include params 33932 1726882909.86676: variable 'interface' from source: play vars 33932 1726882909.86681: variable 'current_interfaces' from source: set_fact 33932 1726882909.86689: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 33932 1726882909.86692: when evaluation is False, skipping this task 33932 1726882909.86694: _execute() done 33932 1726882909.86697: dumping result to json 33932 1726882909.86699: done dumping result, returning 33932 1726882909.86705: done running TaskExecutor() for managed_node1/TASK: Delete dummy interface lsr101 [0e448fcc-3ce9-615b-5c48-000000000944] 33932 1726882909.86710: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000944 33932 1726882909.86792: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000944 33932 1726882909.86794: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was 
False" } 33932 1726882909.86841: no more pending results, returning what we have 33932 1726882909.86844: results queue empty 33932 1726882909.86845: checking for any_errors_fatal 33932 1726882909.86850: done checking for any_errors_fatal 33932 1726882909.86851: checking for max_fail_percentage 33932 1726882909.86853: done checking for max_fail_percentage 33932 1726882909.86854: checking to see if all hosts have failed and the running result is not ok 33932 1726882909.86855: done checking to see if all hosts have failed 33932 1726882909.86855: getting the remaining hosts for this loop 33932 1726882909.86857: done getting the remaining hosts for this loop 33932 1726882909.86861: getting the next task for host managed_node1 33932 1726882909.86868: done getting next task for host managed_node1 33932 1726882909.86871: ^ task is: TASK: Create tap interface {{ interface }} 33932 1726882909.86874: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882909.86877: getting variables 33932 1726882909.86878: in VariableManager get_vars() 33932 1726882909.86921: Calling all_inventory to load vars for managed_node1 33932 1726882909.86924: Calling groups_inventory to load vars for managed_node1 33932 1726882909.86926: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882909.86936: Calling all_plugins_play to load vars for managed_node1 33932 1726882909.86939: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882909.86941: Calling groups_plugins_play to load vars for managed_node1 33932 1726882909.87935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882909.89452: done with get_vars() 33932 1726882909.89482: done getting variables 33932 1726882909.89543: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 33932 1726882909.89650: variable 'interface' from source: play vars TASK [Create tap interface lsr101] ********************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Friday 20 September 2024 21:41:49 -0400 (0:00:00.040) 0:00:30.364 ****** 33932 1726882909.89685: entering _queue_task() for managed_node1/command 33932 1726882909.89970: worker is 1 (out of 1 available) 33932 1726882909.89983: exiting _queue_task() for managed_node1/command 33932 1726882909.89994: done queuing things up, now waiting for results queue to drain 33932 1726882909.89995: waiting for pending results... 
33932 1726882909.90294: running TaskExecutor() for managed_node1/TASK: Create tap interface lsr101 33932 1726882909.90413: in run() - task 0e448fcc-3ce9-615b-5c48-000000000945 33932 1726882909.90419: variable 'ansible_search_path' from source: unknown 33932 1726882909.90422: variable 'ansible_search_path' from source: unknown 33932 1726882909.90462: calling self._execute() 33932 1726882909.90535: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882909.90539: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882909.90552: variable 'omit' from source: magic vars 33932 1726882909.90825: variable 'ansible_distribution_major_version' from source: facts 33932 1726882909.90835: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882909.90965: variable 'type' from source: play vars 33932 1726882909.90971: variable 'state' from source: include params 33932 1726882909.90976: variable 'interface' from source: play vars 33932 1726882909.90978: variable 'current_interfaces' from source: set_fact 33932 1726882909.90984: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 33932 1726882909.90987: when evaluation is False, skipping this task 33932 1726882909.90990: _execute() done 33932 1726882909.90992: dumping result to json 33932 1726882909.90995: done dumping result, returning 33932 1726882909.91002: done running TaskExecutor() for managed_node1/TASK: Create tap interface lsr101 [0e448fcc-3ce9-615b-5c48-000000000945] 33932 1726882909.91007: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000945 33932 1726882909.91091: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000945 33932 1726882909.91093: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was 
False" } 33932 1726882909.91139: no more pending results, returning what we have 33932 1726882909.91143: results queue empty 33932 1726882909.91144: checking for any_errors_fatal 33932 1726882909.91150: done checking for any_errors_fatal 33932 1726882909.91150: checking for max_fail_percentage 33932 1726882909.91153: done checking for max_fail_percentage 33932 1726882909.91154: checking to see if all hosts have failed and the running result is not ok 33932 1726882909.91155: done checking to see if all hosts have failed 33932 1726882909.91155: getting the remaining hosts for this loop 33932 1726882909.91157: done getting the remaining hosts for this loop 33932 1726882909.91160: getting the next task for host managed_node1 33932 1726882909.91169: done getting next task for host managed_node1 33932 1726882909.91172: ^ task is: TASK: Delete tap interface {{ interface }} 33932 1726882909.91175: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882909.91179: getting variables 33932 1726882909.91180: in VariableManager get_vars() 33932 1726882909.91215: Calling all_inventory to load vars for managed_node1 33932 1726882909.91217: Calling groups_inventory to load vars for managed_node1 33932 1726882909.91219: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882909.91229: Calling all_plugins_play to load vars for managed_node1 33932 1726882909.91231: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882909.91233: Calling groups_plugins_play to load vars for managed_node1 33932 1726882909.92218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882909.93686: done with get_vars() 33932 1726882909.93702: done getting variables 33932 1726882909.93743: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 33932 1726882909.93823: variable 'interface' from source: play vars TASK [Delete tap interface lsr101] ********************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Friday 20 September 2024 21:41:49 -0400 (0:00:00.041) 0:00:30.406 ****** 33932 1726882909.93845: entering _queue_task() for managed_node1/command 33932 1726882909.94062: worker is 1 (out of 1 available) 33932 1726882909.94079: exiting _queue_task() for managed_node1/command 33932 1726882909.94091: done queuing things up, now waiting for results queue to drain 33932 1726882909.94092: waiting for pending results... 
33932 1726882909.94263: running TaskExecutor() for managed_node1/TASK: Delete tap interface lsr101 33932 1726882909.94333: in run() - task 0e448fcc-3ce9-615b-5c48-000000000946 33932 1726882909.94344: variable 'ansible_search_path' from source: unknown 33932 1726882909.94348: variable 'ansible_search_path' from source: unknown 33932 1726882909.94378: calling self._execute() 33932 1726882909.94452: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882909.94455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882909.94465: variable 'omit' from source: magic vars 33932 1726882909.94737: variable 'ansible_distribution_major_version' from source: facts 33932 1726882909.94744: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882909.94874: variable 'type' from source: play vars 33932 1726882909.94877: variable 'state' from source: include params 33932 1726882909.94882: variable 'interface' from source: play vars 33932 1726882909.94885: variable 'current_interfaces' from source: set_fact 33932 1726882909.94893: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 33932 1726882909.94896: when evaluation is False, skipping this task 33932 1726882909.94898: _execute() done 33932 1726882909.94901: dumping result to json 33932 1726882909.94903: done dumping result, returning 33932 1726882909.94909: done running TaskExecutor() for managed_node1/TASK: Delete tap interface lsr101 [0e448fcc-3ce9-615b-5c48-000000000946] 33932 1726882909.94914: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000946 33932 1726882909.95003: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000946 33932 1726882909.95006: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 
33932 1726882909.95048: no more pending results, returning what we have 33932 1726882909.95051: results queue empty 33932 1726882909.95052: checking for any_errors_fatal 33932 1726882909.95059: done checking for any_errors_fatal 33932 1726882909.95059: checking for max_fail_percentage 33932 1726882909.95061: done checking for max_fail_percentage 33932 1726882909.95062: checking to see if all hosts have failed and the running result is not ok 33932 1726882909.95062: done checking to see if all hosts have failed 33932 1726882909.95065: getting the remaining hosts for this loop 33932 1726882909.95066: done getting the remaining hosts for this loop 33932 1726882909.95072: getting the next task for host managed_node1 33932 1726882909.95079: done getting next task for host managed_node1 33932 1726882909.95087: ^ task is: TASK: Verify network state restored to default 33932 1726882909.95090: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882909.95093: getting variables 33932 1726882909.95095: in VariableManager get_vars() 33932 1726882909.95127: Calling all_inventory to load vars for managed_node1 33932 1726882909.95130: Calling groups_inventory to load vars for managed_node1 33932 1726882909.95132: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882909.95141: Calling all_plugins_play to load vars for managed_node1 33932 1726882909.95144: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882909.95146: Calling groups_plugins_play to load vars for managed_node1 33932 1726882909.95924: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882909.96848: done with get_vars() 33932 1726882909.96865: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:77 Friday 20 September 2024 21:41:49 -0400 (0:00:00.030) 0:00:30.436 ****** 33932 1726882909.96932: entering _queue_task() for managed_node1/include_tasks 33932 1726882909.97133: worker is 1 (out of 1 available) 33932 1726882909.97146: exiting _queue_task() for managed_node1/include_tasks 33932 1726882909.97158: done queuing things up, now waiting for results queue to drain 33932 1726882909.97160: waiting for pending results... 
33932 1726882909.97328: running TaskExecutor() for managed_node1/TASK: Verify network state restored to default 33932 1726882909.97393: in run() - task 0e448fcc-3ce9-615b-5c48-0000000000ab 33932 1726882909.97404: variable 'ansible_search_path' from source: unknown 33932 1726882909.97433: calling self._execute() 33932 1726882909.97503: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882909.97507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882909.97515: variable 'omit' from source: magic vars 33932 1726882909.97779: variable 'ansible_distribution_major_version' from source: facts 33932 1726882909.97788: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882909.97796: _execute() done 33932 1726882909.97800: dumping result to json 33932 1726882909.97803: done dumping result, returning 33932 1726882909.97806: done running TaskExecutor() for managed_node1/TASK: Verify network state restored to default [0e448fcc-3ce9-615b-5c48-0000000000ab] 33932 1726882909.97816: sending task result for task 0e448fcc-3ce9-615b-5c48-0000000000ab 33932 1726882909.97898: done sending task result for task 0e448fcc-3ce9-615b-5c48-0000000000ab 33932 1726882909.97901: WORKER PROCESS EXITING 33932 1726882909.97950: no more pending results, returning what we have 33932 1726882909.97954: in VariableManager get_vars() 33932 1726882909.97994: Calling all_inventory to load vars for managed_node1 33932 1726882909.97997: Calling groups_inventory to load vars for managed_node1 33932 1726882909.97999: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882909.98009: Calling all_plugins_play to load vars for managed_node1 33932 1726882909.98012: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882909.98014: Calling groups_plugins_play to load vars for managed_node1 33932 1726882909.98895: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882909.99809: done with get_vars() 33932 1726882909.99822: variable 'ansible_search_path' from source: unknown 33932 1726882909.99832: we have included files to process 33932 1726882909.99832: generating all_blocks data 33932 1726882909.99834: done generating all_blocks data 33932 1726882909.99838: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 33932 1726882909.99838: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 33932 1726882909.99840: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 33932 1726882910.00105: done processing included file 33932 1726882910.00106: iterating over new_blocks loaded from include file 33932 1726882910.00107: in VariableManager get_vars() 33932 1726882910.00118: done with get_vars() 33932 1726882910.00120: filtering new block on tags 33932 1726882910.00130: done filtering new block on tags 33932 1726882910.00132: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node1 33932 1726882910.00135: extending task lists for all hosts with included blocks 33932 1726882910.01975: done extending task lists 33932 1726882910.01977: done processing included files 33932 1726882910.01977: results queue empty 33932 1726882910.01978: checking for any_errors_fatal 33932 1726882910.01980: done checking for any_errors_fatal 33932 1726882910.01980: checking for max_fail_percentage 33932 1726882910.01981: done checking for max_fail_percentage 33932 1726882910.01982: checking to see if all hosts have failed and the running 
result is not ok 33932 1726882910.01982: done checking to see if all hosts have failed 33932 1726882910.01983: getting the remaining hosts for this loop 33932 1726882910.01984: done getting the remaining hosts for this loop 33932 1726882910.01985: getting the next task for host managed_node1 33932 1726882910.01988: done getting next task for host managed_node1 33932 1726882910.01989: ^ task is: TASK: Check routes and DNS 33932 1726882910.01991: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882910.01992: getting variables 33932 1726882910.01993: in VariableManager get_vars() 33932 1726882910.02002: Calling all_inventory to load vars for managed_node1 33932 1726882910.02004: Calling groups_inventory to load vars for managed_node1 33932 1726882910.02005: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882910.02009: Calling all_plugins_play to load vars for managed_node1 33932 1726882910.02011: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882910.02012: Calling groups_plugins_play to load vars for managed_node1 33932 1726882910.02730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882910.06671: done with get_vars() 33932 1726882910.06690: done getting variables 33932 1726882910.06724: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 21:41:50 -0400 (0:00:00.098) 0:00:30.535 ****** 33932 1726882910.06743: entering _queue_task() for managed_node1/shell 33932 1726882910.06992: worker is 1 (out of 1 available) 33932 1726882910.07006: exiting _queue_task() for managed_node1/shell 33932 1726882910.07018: done queuing things up, now waiting for results queue to drain 33932 1726882910.07021: waiting for pending results... 
33932 1726882910.07208: running TaskExecutor() for managed_node1/TASK: Check routes and DNS 33932 1726882910.07281: in run() - task 0e448fcc-3ce9-615b-5c48-000000000b17 33932 1726882910.07292: variable 'ansible_search_path' from source: unknown 33932 1726882910.07297: variable 'ansible_search_path' from source: unknown 33932 1726882910.07325: calling self._execute() 33932 1726882910.07417: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882910.07421: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882910.07428: variable 'omit' from source: magic vars 33932 1726882910.07717: variable 'ansible_distribution_major_version' from source: facts 33932 1726882910.07726: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882910.07733: variable 'omit' from source: magic vars 33932 1726882910.07760: variable 'omit' from source: magic vars 33932 1726882910.07795: variable 'omit' from source: magic vars 33932 1726882910.07828: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882910.07853: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882910.07872: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882910.07890: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882910.07899: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882910.07922: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882910.07926: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882910.07929: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882910.08004: 
Set connection var ansible_shell_executable to /bin/sh 33932 1726882910.08013: Set connection var ansible_timeout to 10 33932 1726882910.08016: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882910.08020: Set connection var ansible_pipelining to False 33932 1726882910.08023: Set connection var ansible_connection to ssh 33932 1726882910.08026: Set connection var ansible_shell_type to sh 33932 1726882910.08045: variable 'ansible_shell_executable' from source: unknown 33932 1726882910.08048: variable 'ansible_connection' from source: unknown 33932 1726882910.08050: variable 'ansible_module_compression' from source: unknown 33932 1726882910.08053: variable 'ansible_shell_type' from source: unknown 33932 1726882910.08056: variable 'ansible_shell_executable' from source: unknown 33932 1726882910.08058: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882910.08062: variable 'ansible_pipelining' from source: unknown 33932 1726882910.08067: variable 'ansible_timeout' from source: unknown 33932 1726882910.08075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882910.08178: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882910.08188: variable 'omit' from source: magic vars 33932 1726882910.08193: starting attempt loop 33932 1726882910.08196: running the handler 33932 1726882910.08226: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882910.08239: 
_low_level_execute_command(): starting 33932 1726882910.08247: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 33932 1726882910.08780: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882910.08792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882910.08814: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 33932 1726882910.08827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 33932 1726882910.08839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882910.08887: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882910.08900: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882910.09015: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882910.10705: stdout chunk (state=3): >>>/root <<< 33932 1726882910.10801: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882910.10854: stderr chunk (state=3): >>><<< 33932 1726882910.10858: stdout chunk (state=3): >>><<< 33932 1726882910.10884: _low_level_execute_command() done: rc=0, 
stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882910.10892: _low_level_execute_command(): starting 33932 1726882910.10898: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882910.108802-35299-144863132010019 `" && echo ansible-tmp-1726882910.108802-35299-144863132010019="` echo /root/.ansible/tmp/ansible-tmp-1726882910.108802-35299-144863132010019 `" ) && sleep 0' 33932 1726882910.11336: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882910.11348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882910.11384: stderr chunk (state=3): >>>debug2: checking match 
for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 33932 1726882910.11399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882910.11408: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882910.11449: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882910.11461: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882910.11475: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882910.11580: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882910.13458: stdout chunk (state=3): >>>ansible-tmp-1726882910.108802-35299-144863132010019=/root/.ansible/tmp/ansible-tmp-1726882910.108802-35299-144863132010019 <<< 33932 1726882910.13574: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882910.13619: stderr chunk (state=3): >>><<< 33932 1726882910.13625: stdout chunk (state=3): >>><<< 33932 1726882910.13642: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882910.108802-35299-144863132010019=/root/.ansible/tmp/ansible-tmp-1726882910.108802-35299-144863132010019 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882910.13667: variable 'ansible_module_compression' from source: unknown 33932 1726882910.13716: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-33932njcnmxb6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 33932 1726882910.13748: variable 'ansible_facts' from source: unknown 33932 1726882910.13800: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882910.108802-35299-144863132010019/AnsiballZ_command.py 33932 1726882910.13907: Sending initial data 33932 1726882910.13910: Sent initial data (155 bytes) 33932 1726882910.14582: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882910.14593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882910.14622: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 33932 1726882910.14628: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration <<< 33932 1726882910.14636: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882910.14642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882910.14652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 33932 1726882910.14662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882910.14712: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882910.14724: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882910.14828: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882910.16562: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 33932 1726882910.16651: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 33932 1726882910.16748: stdout 
chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-33932njcnmxb6/tmpy87yq6qu /root/.ansible/tmp/ansible-tmp-1726882910.108802-35299-144863132010019/AnsiballZ_command.py <<< 33932 1726882910.16839: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 33932 1726882910.17840: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882910.17935: stderr chunk (state=3): >>><<< 33932 1726882910.17939: stdout chunk (state=3): >>><<< 33932 1726882910.17958: done transferring module to remote 33932 1726882910.17969: _low_level_execute_command(): starting 33932 1726882910.17976: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882910.108802-35299-144863132010019/ /root/.ansible/tmp/ansible-tmp-1726882910.108802-35299-144863132010019/AnsiballZ_command.py && sleep 0' 33932 1726882910.18415: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882910.18418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882910.18457: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882910.18461: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882910.18465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882910.18513: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882910.18517: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882910.18616: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882910.20362: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882910.20413: stderr chunk (state=3): >>><<< 33932 1726882910.20416: stdout chunk (state=3): >>><<< 33932 1726882910.20426: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882910.20429: _low_level_execute_command(): starting 33932 1726882910.20434: _low_level_execute_command(): executing: /bin/sh -c 
'/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882910.108802-35299-144863132010019/AnsiballZ_command.py && sleep 0' 33932 1726882910.20850: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882910.20855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882910.20903: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882910.20906: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882910.20909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882910.20961: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882910.20969: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882910.21078: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882910.35109: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft 
forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:9e:a1:0b:f9:6d brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.44.90/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 3548sec preferred_lft 3548sec\n inet6 fe80::9e:a1ff:fe0b:f96d/64 scope link \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.44.90 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.44.90 metric 100 \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:41:50.341210", "end": "2024-09-20 21:41:50.349628", "delta": "0:00:00.008418", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 33932 1726882910.36244: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 33932 1726882910.36306: stderr chunk (state=3): >>><<< 33932 1726882910.36312: stdout chunk (state=3): >>><<< 33932 1726882910.36331: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:9e:a1:0b:f9:6d brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.44.90/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 3548sec preferred_lft 3548sec\n inet6 fe80::9e:a1ff:fe0b:f96d/64 scope link \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.44.90 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.44.90 metric 100 \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:41:50.341210", "end": "2024-09-20 21:41:50.349628", "delta": "0:00:00.008418", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, 
"strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
33932 1726882910.36370: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882910.108802-35299-144863132010019/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 33932 1726882910.36375: _low_level_execute_command(): starting 33932 1726882910.36380: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882910.108802-35299-144863132010019/ > /dev/null 2>&1 && sleep 0' 33932 1726882910.36848: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882910.36855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882910.36888: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882910.36901: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882910.36911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882910.36962: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882910.36973: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882910.37080: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882910.38850: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882910.38897: stderr chunk (state=3): >>><<< 33932 1726882910.38901: stdout chunk (state=3): >>><<< 33932 1726882910.38912: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882910.38918: handler run complete 33932 1726882910.38936: Evaluated conditional (False): False 33932 1726882910.38944: attempt loop complete, returning result 33932 1726882910.38946: _execute() done 33932 1726882910.38949: dumping result to json 33932 1726882910.38954: done dumping result, returning 33932 1726882910.38961: done running TaskExecutor() for managed_node1/TASK: Check routes and DNS [0e448fcc-3ce9-615b-5c48-000000000b17] 33932 1726882910.38967: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000b17 33932 1726882910.39071: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000b17 33932 1726882910.39073: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008418", "end": "2024-09-20 21:41:50.349628", "rc": 0, "start": "2024-09-20 21:41:50.341210" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 02:9e:a1:0b:f9:6d brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.44.90/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0 valid_lft 3548sec preferred_lft 3548sec inet6 fe80::9e:a1ff:fe0b:f96d/64 scope link valid_lft forever preferred_lft forever IP ROUTE default via 10.31.44.1 dev eth0 proto dhcp src 
10.31.44.90 metric 100 10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.44.90 metric 100 IP -6 ROUTE ::1 dev lo proto kernel metric 256 pref medium fe80::/64 dev eth0 proto kernel metric 256 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 33932 1726882910.39139: no more pending results, returning what we have 33932 1726882910.39142: results queue empty 33932 1726882910.39143: checking for any_errors_fatal 33932 1726882910.39145: done checking for any_errors_fatal 33932 1726882910.39145: checking for max_fail_percentage 33932 1726882910.39147: done checking for max_fail_percentage 33932 1726882910.39148: checking to see if all hosts have failed and the running result is not ok 33932 1726882910.39149: done checking to see if all hosts have failed 33932 1726882910.39149: getting the remaining hosts for this loop 33932 1726882910.39151: done getting the remaining hosts for this loop 33932 1726882910.39154: getting the next task for host managed_node1 33932 1726882910.39160: done getting next task for host managed_node1 33932 1726882910.39162: ^ task is: TASK: Verify DNS and network connectivity 33932 1726882910.39167: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882910.39171: getting variables 33932 1726882910.39173: in VariableManager get_vars() 33932 1726882910.39213: Calling all_inventory to load vars for managed_node1 33932 1726882910.39216: Calling groups_inventory to load vars for managed_node1 33932 1726882910.39221: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882910.39231: Calling all_plugins_play to load vars for managed_node1 33932 1726882910.39234: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882910.39236: Calling groups_plugins_play to load vars for managed_node1 33932 1726882910.40046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882910.40980: done with get_vars() 33932 1726882910.40995: done getting variables 33932 1726882910.41037: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 21:41:50 -0400 (0:00:00.343) 0:00:30.878 ****** 33932 1726882910.41059: entering _queue_task() for managed_node1/shell 33932 1726882910.41256: worker is 1 (out of 1 available) 33932 1726882910.41270: exiting _queue_task() for managed_node1/shell 33932 1726882910.41281: done queuing things up, now waiting for results queue to drain 33932 1726882910.41283: waiting for pending results... 
33932 1726882910.41455: running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity 33932 1726882910.41534: in run() - task 0e448fcc-3ce9-615b-5c48-000000000b18 33932 1726882910.41545: variable 'ansible_search_path' from source: unknown 33932 1726882910.41548: variable 'ansible_search_path' from source: unknown 33932 1726882910.41583: calling self._execute() 33932 1726882910.41657: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882910.41661: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882910.41671: variable 'omit' from source: magic vars 33932 1726882910.41944: variable 'ansible_distribution_major_version' from source: facts 33932 1726882910.41955: Evaluated conditional (ansible_distribution_major_version != '6'): True 33932 1726882910.42052: variable 'ansible_facts' from source: unknown 33932 1726882910.42498: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 33932 1726882910.42503: variable 'omit' from source: magic vars 33932 1726882910.42529: variable 'omit' from source: magic vars 33932 1726882910.42549: variable 'omit' from source: magic vars 33932 1726882910.42586: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33932 1726882910.42614: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33932 1726882910.42629: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33932 1726882910.42642: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882910.42653: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33932 1726882910.42684: variable 'inventory_hostname' from source: host vars for 'managed_node1' 33932 1726882910.42688: variable 
'ansible_host' from source: host vars for 'managed_node1' 33932 1726882910.42690: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882910.42759: Set connection var ansible_shell_executable to /bin/sh 33932 1726882910.42768: Set connection var ansible_timeout to 10 33932 1726882910.42774: Set connection var ansible_module_compression to ZIP_DEFLATED 33932 1726882910.42780: Set connection var ansible_pipelining to False 33932 1726882910.42782: Set connection var ansible_connection to ssh 33932 1726882910.42785: Set connection var ansible_shell_type to sh 33932 1726882910.42803: variable 'ansible_shell_executable' from source: unknown 33932 1726882910.42807: variable 'ansible_connection' from source: unknown 33932 1726882910.42810: variable 'ansible_module_compression' from source: unknown 33932 1726882910.42813: variable 'ansible_shell_type' from source: unknown 33932 1726882910.42815: variable 'ansible_shell_executable' from source: unknown 33932 1726882910.42817: variable 'ansible_host' from source: host vars for 'managed_node1' 33932 1726882910.42819: variable 'ansible_pipelining' from source: unknown 33932 1726882910.42822: variable 'ansible_timeout' from source: unknown 33932 1726882910.42824: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 33932 1726882910.42919: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882910.42928: variable 'omit' from source: magic vars 33932 1726882910.42942: starting attempt loop 33932 1726882910.42945: running the handler 33932 1726882910.42948: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 33932 1726882910.42964: _low_level_execute_command(): starting 33932 1726882910.42975: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 33932 1726882910.43472: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882910.43476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882910.43511: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882910.43515: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882910.43573: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882910.43577: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882910.43581: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882910.43674: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882910.45227: stdout chunk (state=3): >>>/root <<< 33932 1726882910.45331: stderr chunk (state=3): >>>debug2: Received 
exit status from master 0 <<< 33932 1726882910.45381: stderr chunk (state=3): >>><<< 33932 1726882910.45385: stdout chunk (state=3): >>><<< 33932 1726882910.45403: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882910.45413: _low_level_execute_command(): starting 33932 1726882910.45425: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882910.4540107-35307-131893319656280 `" && echo ansible-tmp-1726882910.4540107-35307-131893319656280="` echo /root/.ansible/tmp/ansible-tmp-1726882910.4540107-35307-131893319656280 `" ) && sleep 0' 33932 1726882910.45831: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33932 1726882910.45841: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882910.45889: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882910.45893: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration <<< 33932 1726882910.45896: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 33932 1726882910.45905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882910.45951: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882910.45954: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882910.46058: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882910.47914: stdout chunk (state=3): >>>ansible-tmp-1726882910.4540107-35307-131893319656280=/root/.ansible/tmp/ansible-tmp-1726882910.4540107-35307-131893319656280 <<< 33932 1726882910.48023: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882910.48070: stderr chunk (state=3): >>><<< 33932 1726882910.48074: stdout chunk (state=3): >>><<< 33932 1726882910.48085: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882910.4540107-35307-131893319656280=/root/.ansible/tmp/ansible-tmp-1726882910.4540107-35307-131893319656280 , stderr=OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882910.48107: variable 'ansible_module_compression' from source: unknown 33932 1726882910.48151: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-33932njcnmxb6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 33932 1726882910.48182: variable 'ansible_facts' from source: unknown 33932 1726882910.48232: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882910.4540107-35307-131893319656280/AnsiballZ_command.py 33932 1726882910.48326: Sending initial data 33932 1726882910.48331: Sent initial data (156 bytes) 33932 1726882910.48952: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882910.48957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
<<< 33932 1726882910.48988: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882910.49001: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882910.49011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882910.49061: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882910.49067: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882910.49176: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882910.50895: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 33932 1726882910.50986: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; 
using 64 <<< 33932 1726882910.51081: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-33932njcnmxb6/tmp2eu5n4lf /root/.ansible/tmp/ansible-tmp-1726882910.4540107-35307-131893319656280/AnsiballZ_command.py <<< 33932 1726882910.51170: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 33932 1726882910.52150: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882910.52234: stderr chunk (state=3): >>><<< 33932 1726882910.52237: stdout chunk (state=3): >>><<< 33932 1726882910.52251: done transferring module to remote 33932 1726882910.52259: _low_level_execute_command(): starting 33932 1726882910.52262: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882910.4540107-35307-131893319656280/ /root/.ansible/tmp/ansible-tmp-1726882910.4540107-35307-131893319656280/AnsiballZ_command.py && sleep 0' 33932 1726882910.52669: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882910.52682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882910.52703: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882910.52714: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882910.52767: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882910.52791: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882910.52794: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882910.52889: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882910.54641: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882910.54684: stderr chunk (state=3): >>><<< 33932 1726882910.54689: stdout chunk (state=3): >>><<< 33932 1726882910.54700: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33932 1726882910.54706: _low_level_execute_command(): starting 33932 1726882910.54711: 
_low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882910.4540107-35307-131893319656280/AnsiballZ_command.py && sleep 0' 33932 1726882910.55103: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882910.55108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882910.55145: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882910.55151: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882910.55157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882910.55162: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 33932 1726882910.55176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882910.55220: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882910.55232: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882910.55344: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882910.95212: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org 
mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1399 0 --:--:-- --:--:-- --:--:-- 1405\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 12125 0 --:--:-- --:--:-- --:--:-- 12125", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:41:50.683346", "end": "2024-09-20 21:41:50.950222", "delta": "0:00:00.266876", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 33932 1726882910.96490: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 33932 1726882910.96543: stderr chunk (state=3): >>><<< 33932 1726882910.96547: stdout chunk (state=3): >>><<< 33932 1726882910.96574: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org 
mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1399 0 --:--:-- --:--:-- --:--:-- 1405\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 12125 0 --:--:-- --:--:-- --:--:-- 12125", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:41:50.683346", "end": "2024-09-20 21:41:50.950222", "delta": "0:00:00.266876", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 33932 1726882910.96606: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! 
curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882910.4540107-35307-131893319656280/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 33932 1726882910.96613: _low_level_execute_command(): starting 33932 1726882910.96618: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882910.4540107-35307-131893319656280/ > /dev/null 2>&1 && sleep 0' 33932 1726882910.97073: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33932 1726882910.97078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33932 1726882910.97101: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration <<< 33932 1726882910.97105: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33932 1726882910.97159: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 33932 1726882910.97170: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33932 1726882910.97173: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33932 1726882910.97260: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33932 1726882910.99143: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33932 1726882910.99644: stderr chunk (state=3): >>><<< 33932 1726882910.99647: stdout chunk (state=3): >>><<< 33932 1726882910.99650: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 33932 1726882910.99652: handler run complete 33932 1726882910.99654: Evaluated conditional (False): False 33932 1726882910.99656: attempt loop complete, returning result 33932 1726882910.99658: _execute() done 33932 1726882910.99660: dumping result to json 33932 1726882910.99661: done dumping result, returning 33932 1726882910.99665: done running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity [0e448fcc-3ce9-615b-5c48-000000000b18] 33932 1726882910.99670: sending task result for task 0e448fcc-3ce9-615b-5c48-000000000b18 33932 1726882910.99740: done sending task result for task 0e448fcc-3ce9-615b-5c48-000000000b18 33932 1726882910.99742: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "changed": false,
    "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n",
    "delta": "0:00:00.266876",
    "end": "2024-09-20 21:41:50.950222",
    "rc": 0,
    "start": "2024-09-20 21:41:50.683346"
}

STDOUT:

CHECK DNS AND CONNECTIVITY
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org

STDERR:

 % Total % Received % Xferd Average Speed Time Time Time Current
 Dload Upload Total Spent Left Speed
 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
 100 305 100 305 0 0 1399 0 --:--:-- --:--:-- --:--:-- 1405
 % Total % Received % Xferd Average Speed Time Time Time Current
 Dload Upload Total Spent Left Speed
 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
 100 291 100 291 0 0 12125 0 --:--:-- --:--:-- --:--:-- 12125
33932 1726882910.99867: no more pending results, returning what we have 33932 1726882910.99870: results queue empty 33932 1726882910.99871: checking for any_errors_fatal 33932 1726882910.99882: done checking for any_errors_fatal 33932 1726882910.99883: checking for max_fail_percentage 33932 1726882910.99884: done checking for max_fail_percentage 33932 1726882910.99885: checking to see if all hosts have failed and the running result is not ok 33932 1726882910.99886: done checking to see if all hosts have failed 33932 1726882910.99887: getting the remaining hosts for this loop 33932 1726882910.99888: done getting the remaining hosts for this loop 33932 1726882910.99891: getting the next task for host managed_node1 33932 1726882910.99902: done getting next task for host managed_node1 33932 1726882910.99904: ^ task is: TASK: meta (flush_handlers) 33932 1726882910.99906: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state?
(None), always child state? (None), did rescue? False, did start at task? False 33932 1726882910.99911: getting variables 33932 1726882910.99912: in VariableManager get_vars() 33932 1726882910.99949: Calling all_inventory to load vars for managed_node1 33932 1726882910.99951: Calling groups_inventory to load vars for managed_node1 33932 1726882910.99953: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882910.99965: Calling all_plugins_play to load vars for managed_node1 33932 1726882910.99967: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882910.99970: Calling groups_plugins_play to load vars for managed_node1 33932 1726882911.01577: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882911.03359: done with get_vars() 33932 1726882911.03382: done getting variables 33932 1726882911.03452: in VariableManager get_vars() 33932 1726882911.03468: Calling all_inventory to load vars for managed_node1 33932 1726882911.03470: Calling groups_inventory to load vars for managed_node1 33932 1726882911.03472: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882911.03477: Calling all_plugins_play to load vars for managed_node1 33932 1726882911.03480: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882911.03482: Calling groups_plugins_play to load vars for managed_node1 33932 1726882911.04705: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882911.06423: done with get_vars() 33932 1726882911.06452: done queuing things up, now waiting for results queue to drain 33932 1726882911.06454: results queue empty 33932 1726882911.06455: checking for any_errors_fatal 33932 1726882911.06459: done checking for any_errors_fatal 33932 1726882911.06460: checking for max_fail_percentage 33932 1726882911.06461: done checking for max_fail_percentage 33932 
1726882911.06462: checking to see if all hosts have failed and the running result is not ok 33932 1726882911.06462: done checking to see if all hosts have failed 33932 1726882911.06465: getting the remaining hosts for this loop 33932 1726882911.06466: done getting the remaining hosts for this loop 33932 1726882911.06469: getting the next task for host managed_node1 33932 1726882911.06473: done getting next task for host managed_node1 33932 1726882911.06474: ^ task is: TASK: meta (flush_handlers) 33932 1726882911.06476: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33932 1726882911.06478: getting variables 33932 1726882911.06479: in VariableManager get_vars() 33932 1726882911.06497: Calling all_inventory to load vars for managed_node1 33932 1726882911.06499: Calling groups_inventory to load vars for managed_node1 33932 1726882911.06501: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882911.06506: Calling all_plugins_play to load vars for managed_node1 33932 1726882911.06509: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882911.06511: Calling groups_plugins_play to load vars for managed_node1 33932 1726882911.07567: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882911.08473: done with get_vars() 33932 1726882911.08487: done getting variables 33932 1726882911.08520: in VariableManager get_vars() 33932 1726882911.08529: Calling all_inventory to load vars for managed_node1 33932 1726882911.08530: Calling groups_inventory to load vars for managed_node1 33932 1726882911.08532: Calling all_plugins_inventory to load vars for managed_node1 33932 1726882911.08535: Calling all_plugins_play to load vars for 
managed_node1 33932 1726882911.08536: Calling groups_plugins_inventory to load vars for managed_node1 33932 1726882911.08538: Calling groups_plugins_play to load vars for managed_node1 33932 1726882911.09381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33932 1726882911.10791: done with get_vars() 33932 1726882911.10809: done queuing things up, now waiting for results queue to drain 33932 1726882911.10811: results queue empty 33932 1726882911.10812: checking for any_errors_fatal 33932 1726882911.10813: done checking for any_errors_fatal 33932 1726882911.10813: checking for max_fail_percentage 33932 1726882911.10814: done checking for max_fail_percentage 33932 1726882911.10814: checking to see if all hosts have failed and the running result is not ok 33932 1726882911.10815: done checking to see if all hosts have failed 33932 1726882911.10815: getting the remaining hosts for this loop 33932 1726882911.10816: done getting the remaining hosts for this loop 33932 1726882911.10818: getting the next task for host managed_node1 33932 1726882911.10820: done getting next task for host managed_node1 33932 1726882911.10821: ^ task is: None 33932 1726882911.10822: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33932 1726882911.10822: done queuing things up, now waiting for results queue to drain 33932 1726882911.10823: results queue empty 33932 1726882911.10823: checking for any_errors_fatal 33932 1726882911.10824: done checking for any_errors_fatal 33932 1726882911.10824: checking for max_fail_percentage 33932 1726882911.10825: done checking for max_fail_percentage 33932 1726882911.10825: checking to see if all hosts have failed and the running result is not ok 33932 1726882911.10826: done checking to see if all hosts have failed 33932 1726882911.10827: getting the next task for host managed_node1 33932 1726882911.10829: done getting next task for host managed_node1 33932 1726882911.10829: ^ task is: None 33932 1726882911.10830: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False

PLAY RECAP *********************************************************************
managed_node1              : ok=79   changed=2    unreachable=0    failed=0    skipped=67   rescued=0    ignored=0

Friday 20 September 2024  21:41:51 -0400 (0:00:00.698)       0:00:31.576 ******
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 1.80s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Install iproute --------------------------------------------------------- 1.65s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
fedora.linux_system_roles.network : Check which services are running ---- 1.58s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 1.48s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_vlan_mtu_nm.yml:6
Install iproute --------------------------------------------------------- 1.40s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
Create veth interface lsr101 -------------------------------------------- 1.27s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
fedora.linux_system_roles.network : Check which packages are installed --- 1.16s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.12s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:3
Check if system is ostree ----------------------------------------------- 0.83s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
Gather the minimum subset of ansible_facts required by the network role test --- 0.81s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 0.79s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.79s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.77s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.75s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Verify DNS and network connectivity ------------------------------------- 0.70s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.60s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.59s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Gather current interface info ------------------------------------------- 0.54s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.52s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Set up veth as managed by NetworkManager -------------------------------- 0.51s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35
33932 1726882911.10926: RUNNING CLEANUP
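For reference, the "Verify DNS and network connectivity" task in this log runs a shell loop on the managed node: `getent hosts <host>` to verify name resolution, then `curl -o /dev/null https://<host>` to verify reachability, exiting non-zero on the first failure. The same check can be sketched as a standalone Python script (a rough equivalent, not part of the playbook; the function names `check_dns`, `check_https`, and `verify_connectivity` are hypothetical):

```python
import socket
import urllib.request


def check_dns(host):
    """Resolve a hostname, roughly mirroring `getent hosts <host>`."""
    try:
        infos = socket.getaddrinfo(host, None)
    except socket.gaierror:
        return False
    for _family, _type, _proto, _canon, sockaddr in infos:
        print(sockaddr[0], host)  # one "address hostname" line, like getent
    return bool(infos)


def check_https(host, timeout=10):
    """Fetch https://<host>/ and discard the body, like `curl -o /dev/null`."""
    try:
        with urllib.request.urlopen(f"https://{host}/", timeout=timeout):
            return True
    except OSError:
        return False


def verify_connectivity(hosts):
    """Return 0 if every host resolves and answers HTTPS, else 1."""
    print("CHECK DNS AND CONNECTIVITY")
    for host in hosts:
        if not check_dns(host):
            print(f"FAILED to lookup host {host}")
            return 1
        if not check_https(host):
            print(f"FAILED to contact host {host}")
            return 1
    return 0


if __name__ == "__main__":
    raise SystemExit(verify_connectivity(
        ["mirrors.fedoraproject.org", "mirrors.centos.org"]))
```

Like the original shell loop, this fails fast on the first unresolvable or unreachable host, which is why the task result above reports `rc: 0` only after both mirrors passed both checks.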